Computers just got a lot better at writing
Published: 2020-03-04 13:00:10
Summary
How machines can mimic our language.
Join the Open Sourced Reporting Network: http://www.vox.com/opensourcednetwork
Something big happened in the past year: Researchers created computer programs that can write long passages of coherent, original text.
Language models like GPT-2, Grover, and CTRL generate text passages that read as if written by someone fluent in the language, but not in the truth. The AI field behind them, natural language processing (NLP), didn't exactly set out to create a fake news machine. Rather, this is the byproduct of a line of research into massive pretrained language models: machine learning programs that store vast statistical maps of how we use our language. So far, the technology's creative uses seem to outnumber its malicious ones. But it's not difficult to imagine how this fake text could cause harm, especially as these models become widely shared and deployable by anyone with basic know-how. Read more here: https://www.vox.com/recode/2020/3/4/21163743/ai-language-generation-fake-text-gpt2
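At their core, these models take a prompt and repeatedly predict the next word. A minimal sketch of that idea, assuming the Hugging Face transformers library and the publicly released small GPT-2 checkpoint (the prompt and sampling settings here are illustrative, not from the video):

# Sketch: sampling continuations from the small GPT-2 checkpoint with the
# Hugging Face transformers "text-generation" pipeline.
from transformers import pipeline, set_seed

generator = pipeline("text-generation", model="gpt2")
set_seed(42)  # make the sampled continuations reproducible

prompt = "Something big happened in the past year:"
samples = generator(
    prompt,
    max_length=60,           # total length in tokens, prompt included
    do_sample=True,          # sample rather than always pick the likeliest word
    num_return_sequences=3,  # draw three different continuations
)

for i, sample in enumerate(samples, start=1):
    print(f"--- continuation {i} ---")
    print(sample["generated_text"])

Larger checkpoints (GPT-2 medium, large, XL) follow the same interface and tend to produce more coherent passages.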
Open Sourced is a year-long reporting project from Recode by Vox that goes deep into the closed ecosystems of data, privacy, algorithms, and artificial intelligence. Learn more at http://www.vox.com/opensourced
This project is made possible by the Omidyar Network. All Open Sourced content is editorially independent and produced by our journalists.
Watch all episodes of Open Sourced right here on YouTube: http://bit.ly/2tIHftD
Try out natural language generation and detection with these tools:
https://demo.allennlp.org/next-token-lm
https://talktotransformer.com/
https://transformer.huggingface.co/
https://grover.allenai.org/
https://www.ai21.com/haim
http://gltr.io/
https://play.aidungeon.io/
https://huggingface.co/openai-detector/
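The detector behind the last link can also be tried from Python. A minimal sketch, assuming the transformers library and the roberta-base-openai-detector checkpoint (the GPT-2 output detector released alongside GPT-2); the labels and score shown in the comment are illustrative:

# Sketch: scoring a passage with the GPT-2 output detector that powers
# https://huggingface.co/openai-detector/ (a RoBERTa classifier).
from transformers import pipeline

detector = pipeline("text-classification", model="roberta-base-openai-detector")

passage = (
    "Language models like GPT-2 generate text passages that read as if "
    "written by someone fluent in the language, but not in the truth."
)
print(detector(passage))  # e.g. [{"label": "Real" or "Fake", "score": 0.97}]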
Sources:
https://ruder.io/nlp-imagenet/
https://medium.com/@ageitgey/deepfaking-the-news-with-nlp-and-transformer-models-5e057ebd697d
https://openai.com/blog/better-language-models/
https://blog.einstein.ai/introducing-a-conditional-transformer-language-model-for-controllable-generation/
https://veredshwartz.blogspot.com/2019/08/text-generation.html
http://www.mattkenney.me/gpt-2-345/
http://www.mattkenney.me/gpt-2/
https://jalammar.github.io/illustrated-gpt2/
https://mc.ai/introduction-to-language-modelling-and-deep-neural-network-based-text-generation/
https://fortune.com/2020/01/20/natural-language-processing-business/
https://www.vox.com/future-perfect/2019/2/14/18222270/artificial-intelligence-open-ai-natural-language-processing
https://www.newyorker.com/magazine/2019/10/14/can-a-machine-learn-to-write-for-the-new-yorker
https://www.youtube.com/watch?v=GEtbD6pqTTE
https://arxiv.org/pdf/1905.12616.pdf
https://arxiv.org/abs/1911.03343
https://arxiv.org/abs/1904.09751
https://techscience.org/a/2019121801/
https://www.middlebury.edu/institute/sites/www.middlebury.edu.institute/files/2019-11/The%20Industrialization%20of%20Terrorist%20Propaganda%20-%20CTEC.pdf?fv=TzdJnlDw
http://newsyoucantuse.com/
https://aiweirdness.com/post/168051907512/the-first-line-of-a-novel-by-an-improved-neural
https://aiweirdness.com/post/159302925452/the-neural-network-generated-pickup-lines-that-are
https://www.nytimes.com/interactive/2018/10/26/opinion/halloween-spooky-costumes-machine-learning-generator.html
https://aiweirdness.com/post/160985569682/paint-colors-designed-by-neural-network-part-2
https://www.reddit.com/r/SubSimulatorGPT2/
https://twitter.com/dril_gpt2
https://cloud.google.com/text-to-speech/
Vox.com is a news website that helps you cut through the noise and understand what's really driving the events in the headlines. Check out http://www.vox.com.
Watch our full video catalog: http://goo.gl/IZONyE
Follow Vox on Facebook: http://goo.gl/U2g06o
Or Twitter: http://goo.gl/XFrZ5H