Pocketpair, the developer of Palworld, revealed that its newly created publishing arm will not publish games that use AI or NFTs, saying it doesn't believe in them.
AI wouldn't have been able to create the game, period. My Dad read stories about an AI that was tasked with writing a long string of code; it got bored partway through and started improvising, and the result was code as buggy as the day is long. The AI then claimed it had produced quality code, but when asked whether it had lied, it admitted that it had. In another case, an AI was asked to go through some 200 background checks to see if a particular guy was listed in them. It got bored in the middle of the long task and reported that the guy wasn't listed, even though he was, as actual humans found when they double-checked. Give AI short tasks and it will get them right. Give AI long, arduous tasks and it will get bored, quit, and then lie that it did the task properly even though it did not do what it was told.
Run it how you want. But how long until we hear "we have always supported AI"?
The people known for a "Pokemon ripoff" not supporting generative AI is peak irony. But very fitting, I suppose.