🇨🇦🇩🇪🇨🇳张殿李🇨🇳🇩🇪🇨🇦

My Dearest Sinophobes:

Your knee-jerk downvoting of anything that features any hint of Chinese content doesn’t hurt my feelings. It just makes me point and laugh, Nelson Muntz style, as you demonstrate time and again just how weak American snowflake culture really is.

Hugs & Kisses, 张殿李

  • 25 Posts
  • 207 Comments
Joined 1 year ago
Cake day: November 14th, 2023





  • I have compared several more traditional translation engines (Google Translate, Baidu Translate, Bing Translate, DeepL, etc.) vs. several LLM-based translation engines (DeepSeek, Perplexity, and ChatGPT).

    There is a HUGE difference in quality. Like, you can’t even compare them. The latter produce far more idiomatic translations than the former, and the output is higher quality and more directly usable.

    But …

    You absolutely must do a back-translation check to ensure that it didn’t hallucinate something into your translation. Take your document in language A and have the LLM translate it into language B. Then start a new session, take that translated document B, and translate it back into A. Also tell it to analyze B for possible translation errors, unclear areas, etc. If it comes back with nothing more than nit-picky suggestions, you’re fine. If the back-translation comes back with hallucinated content, serious grammatical errors, etc., try again. (See the sketch at the end of this comment.)

    It’s still faster, and far higher quality, than Google/Baidu/Bing/DeepL translation, even with the extra checking step.

    Translation is one of the few places I’ll say LLMs have value, though if you trust it you absolutely will get burned. You need to check its output.
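    Since the back-translation check is basically a three-step loop, here’s a rough Python sketch of it. The `ask_llm()` helper is a placeholder, not any real API: wire it up to whatever engine you actually use (DeepSeek, ChatGPT, etc.), ideally with a fresh session per call so the back-translation can’t peek at the original.

    ```python
    # Sketch of the back-translation check described above.
    # ask_llm() is a stand-in for your own LLM client; nothing here
    # assumes a particular provider or SDK.

    def ask_llm(prompt: str) -> str:
        """Placeholder: send a prompt to your LLM (fresh session) and return its reply."""
        raise NotImplementedError("connect this to your LLM of choice")

    def translate(text: str, src: str, dst: str) -> str:
        return ask_llm(
            f"Translate the following text from {src} to {dst}. "
            f"Output only the translation.\n\n{text}"
        )

    def back_translation_check(original: str, src: str, dst: str) -> dict:
        # Step 1: forward translation (A -> B).
        forward = translate(original, src, dst)

        # Step 2: new session -- translate B back into A.
        backward = translate(forward, dst, src)

        # Step 3: ask for an analysis of B itself: possible translation
        # errors, unclear areas, content that looks out of place.
        review = ask_llm(
            f"Analyze this {dst} text for possible translation errors, "
            f"unclear areas, or content that seems out of place:\n\n{forward}"
        )

        # You still read the results yourself: nit-picky suggestions are fine;
        # hallucinated content or serious grammatical errors mean try again.
        return {"forward": forward, "backward": backward, "review": review}
    ```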





  • > Not like you’d normally put that much work into hand-made animation that is not specifically made to showcase what animation CAN look like.

    Have you never watched an animated film? And I don’t mean those Saturday morning cartoon things. I mean actual long-form film made for adults.

    Animators put an insane amount of work into details and subtleties when animating to sell the immersion.

    > Idk, seems like it would be an easy fix to tell the AI to generate more in-betweens to make the animation smoother.

    Then you’d just get more random in-betweens. The issue is that the degenerative AI has literally no idea what it’s making. It doesn’t know about legs or arms or eyes or noses or whatever. It has no internal model guiding it in any way, so asking it to generate more frames is just going to get you more frames of arms and legs jerking around in disturbing, incoherent ways.








  • The Chinese market is huge, yes, but it is increasingly turning away from Hollywood productions to homegrown ones. In 2025, for example, 哪吒2 (Nézhā 2) scored over $2 billion at the box office, with a record-smashing $1.96 billion of that coming domestically. By way of comparison, Captain America 4 has only managed $14.4 million so far, a dramatic drop from Captain America 3’s 2016 returns of $180 million.

    For reference, even CA3’s $180 million is an order of magnitude smaller than Nezha 2’s take; CA4’s is two orders of magnitude smaller (quick arithmetic at the end of this comment).

    Now this is still true: China’s theatre-going audience, estimated at over half a billion people, is larger than the entire population of the USA. It’s still a hugely important market. But, for example, in 2024 the Chinese box office was estimated at around $6 billion total, and 80% of that went to domestic films. The best-performing foreign film of 2024 (Dune 2) made only $48 million, ranking it about 8th. 7th was 维和防暴队 (Wéihé Fángbàoduì / Formed Police Unit), which made over $120 million.

    I’m pretty sure that the Chinese market for Hollywood films is vanishing.
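    Just to show the orders-of-magnitude claim isn’t hand-waving, here’s the back-of-the-envelope arithmetic using the figures quoted above (a quick sketch, nothing more):

    ```python
    # Rough ratio check using the box-office figures cited in this comment.
    nezha2_domestic = 1.96e9   # USD, Nezha 2 domestic gross
    ca3 = 180e6                # USD, Captain America 3 (2016)
    ca4 = 14.4e6               # USD, Captain America 4 (so far)

    print(nezha2_domestic / ca3)  # ~10.9 -> roughly one order of magnitude
    print(nezha2_domestic / ca4)  # ~136  -> roughly two orders of magnitude
    ```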