These days, while browsing Twitter or Instagram, you have most likely come across, by chance or not, one of these two types of content: either a screenshot showing a question and an answer, or a drawing or illustration of someone you follow on social media.
Both types of content, the text and the images, are generated by artificial intelligence (AI).
The accessibility and effectiveness of these technological tools have led many users to try them recently and, moreover, to share the results of “their” creations with their followers.
In the case of AI-generated text, the tool is called ChatGPT and has been developed by OpenAI, the company co-founded by Twitter’s new owner, Elon Musk. It can be accessed through a web browser, free of charge, although you have to register beforehand, and it allows you to make all kinds of queries, from a philosophical dissertation to a programming question.
For their part, the illustrations are provided by the Magic Avatars function of Lensa, a photo editing tool that uses technology from OpenAI competitor Stability AI. Users can download Lensa for free through the Android or Apple app store, although there is the option of unlocking premium benefits by paying up to 30 euros.
Despite the interest they are arousing among users, both tools have generated skepticism among experts.
The founder of Verne and journalist for La Vanguardia, Delia Rodríguez, argued on her Twitter account that technologies such as ChatGPT can be deeply problematic for tasks such as digital journalism, and illustrated this with a response from the bot itself in which it explained why this AI could be an example of an unreliable narrator.
Jennifer King, a privacy and data policy researcher at the Stanford Institute, told the Wall Street Journal that “these kinds of tools tend to be flashy,” referring to how appealing it can be to try them out and share the results on social media, but warned that “without the proper guardrails they get you into a lot of trouble.”