At some point AI will get good enough to produce actually good code, but that's probably not going to happen in the next 5 years. Most AI, or more accurately LLMs, are already running out of training data. Before ChatGPT came out, most of the code floating around the internet was real, hand-crafted code; that's why the jump from no LLMs to GPT-3, and then to Claude 3.5, was so massive: those models were trained on real, prime-quality data. Nowadays much of the code on the internet is AI-generated slop, and an LLM is a garbage-in, garbage-out machine. The best analogy for it is incest: you can only marry your second cousin so many times until your entire family tree becomes messed up. Unless someone comes up with an entirely new algorithm (radically different from what we have today), the generative AI industry will hit a singularity of AI slop.
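The "incest" effect can be illustrated with a toy simulation: repeatedly train a model on data sampled from the previous generation's model and watch diversity collapse. Here the "model" is just a probability distribution over a tiny made-up vocabulary, which is a deliberate cartoon of LLM training, not how it actually works; all the names and numbers below are illustrative choices.

```python
# Toy "garbage in, garbage out" loop: each generation is trained only
# on synthetic samples drawn from the previous generation's model.
# The "model" here is just a {token: probability} table (an
# illustrative simplification, not real LLM training).
import math
import random
from collections import Counter

random.seed(42)

VOCAB = list("abcdefghij")   # 10 pretend tokens
SAMPLES_PER_GEN = 20         # small "dataset" -> collapse shows up fast
GENERATIONS = 30

def entropy(dist):
    """Shannon entropy in bits of a {token: prob} distribution."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# Generation 0: trained on "real" data, modeled as a uniform distribution.
dist = {t: 1 / len(VOCAB) for t in VOCAB}
print(f"gen  0: {len(dist):2d} tokens, entropy {entropy(dist):.2f} bits")

for gen in range(1, GENERATIONS + 1):
    # "Publish" synthetic data by sampling from the current model...
    tokens = random.choices(list(dist), weights=dist.values(), k=SAMPLES_PER_GEN)
    # ...then "train" the next model on only that synthetic data.
    counts = Counter(tokens)
    dist = {t: c / SAMPLES_PER_GEN for t, c in counts.items()}

print(f"gen {GENERATIONS}: {len(dist):2d} tokens, entropy {entropy(dist):.2f} bits")
```

Tokens that happen to get sampled zero times in one generation are gone forever, so the vocabulary and entropy only ratchet downward: the family tree gets narrower every round.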
Number 2 is that an LLM is a prediction machine, and the way it predicts is by looking at which word most often came after a given word in its dataset. This is why most of the code spewed out by AI is JS and TS: they're the most common programming languages, because they're what colleges, boot camps, and millions of YouTube tutorials teach. But these languages are trash in performance. There's a reason most high-performance servers don't run on TypeScript; they run on Go, Java, Elixir, Rust, or Erlang. There's a reason your firmware is not written in embedded TS; it's written in C or assembly instead. The point is that if most of the code on the internet is shit, then the AI can only produce shit.
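The "most often word that came after" idea can be sketched as a bigram frequency model. To be clear, real LLMs are neural networks over tokens, not raw word counts; this toy corpus and predictor are purely illustrative.

```python
# Minimal bigram "prediction machine": predict the next word as the
# word that most often followed the previous word in the training
# corpus. A cartoon of next-token prediction, not a real LLM.
from collections import Counter, defaultdict

corpus = (
    "the model predicts the next word and the next word is "
    "whatever word most often followed the previous word"
).split()

# Count how often each word follows each other word.
followers = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    followers[prev][nxt] += 1

def predict(word):
    """Return the most frequent successor of `word` in the corpus."""
    return followers[word].most_common(1)[0][0]

print(predict("the"))  # in this corpus, "the" is most often followed by "next"
```

Swap in a corpus dominated by JS tutorials and the model's "most likely next token" will be JS tutorial code, which is the whole point: the prediction can only be as good as what dominates the data.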