{"id":5777,"date":"2024-12-27T20:13:21","date_gmt":"2024-12-27T18:13:21","guid":{"rendered":"https:\/\/talkingtochatbots.com\/bing\/technology-and-ai\/examples-of-text-based-generative-ai-models-including-llms\/"},"modified":"2024-12-30T12:11:24","modified_gmt":"2024-12-30T10:11:24","slug":"examples-of-text-based-generative-ai-models-including-llms","status":"publish","type":"page","link":"https:\/\/talkingtochatbots.com\/es\/bing\/technology-and-ai\/examples-of-text-based-generative-ai-models-including-llms\/","title":{"rendered":"Examples of text-based generative AI models, including LLMs"},"content":{"rendered":"\r\n<div id=\"output\" style=\"margin-top:20px; padding:10px; border:1px solid #ccc; background:#e4e4e4;\">\r\n    Loading data from <a href=\"https:\/\/huggingface.co\/datasets\/reddgr\/talking-to-chatbots-chats\" target=\"_blank\" rel=\"noopener\" title=\"\">reddgr\/talking-to-chatbots-chats<\/a> &#8230;\r\n<\/div>\r\n\r\n<script>\r\n    document.addEventListener(\"DOMContentLoaded\", () => {\r\n        const args = {\r\n            dataset: \"reddgr\/talking-to-chatbots-chats\",\r\n            config: \"default\",\r\n            split: \"train\",\r\n            conversation_id: \"94780168-5f25-4d94-a97d-cd50b2e4867b\",\r\n            offset: 0,\r\n            length: 1\r\n        };\r\n        window.apiCallArgs = args;\r\n    });\r\n<\/script>\r\n\r\n<script src=\"\/wp-content\/uploads\/js\/chat_header_v04.js\"><\/script>\r\n\r\n\r\n\r\n<hr class=\"wp-block-separator has-alpha-channel-opacity\"\/>\r\n\r\n\ud83d\ude0e What are some examples of text-based generative AI models? Complete the list, starting with LLMs&#8230;\r\n<blockquote>\ud83e\udd16 Text-based generative AI models are algorithms that can create new and realistic text content based on a given input. They use deep learning techniques such as transformers to learn from large amounts of data and generate human-like text. 
Some examples of text-based generative AI models are:<br><br>\u2022  LLMs (Large Language Models): These are models trained on a broad set of unlabeled data that can be used for different tasks with additional fine-tuning. They are also known as AI foundation models <a href=\"https:\/\/www.mckinsey.com\" target=\"_blank\" rel=\"nofollow noopener\">[www.mckinsey.com]<\/a>. Some examples of LLMs are GPT-3, Jurassic-1, Jurassic-X, and Cohere Generate <a href=\"https:\/\/bing.com\" target=\"_blank\" rel=\"nofollow noopener\">[bing.com]<\/a>.<br><br>\u2022  BERT (Bidirectional Encoder Representations from Transformers): This is a model that uses bidirectional transformers to learn from both the left and right context of a given word. It can be used for tasks such as question answering, sentiment analysis, and named entity recognition <a href=\"https:\/\/bing.com\" target=\"_blank\" rel=\"nofollow noopener\">[bing.com]<\/a>.<br><br>\u2022  RoBERTa (Robustly Optimized BERT Pretraining Approach): This is a model that improves upon BERT by using more data, larger batches, and more training steps. 
It also removes the next-sentence prediction objective and uses byte-pair encoding as a subword tokenizer <a href=\"https:\/\/bing.com\" target=\"_blank\" rel=\"nofollow noopener\">[bing.com]<\/a>.<br><br>\u2022  Luminous: This is a model that uses self-attention to generate text from structured data, such as tables or graphs. It can be used for tasks such as data-to-text generation, text summarization, and natural language generation <a href=\"https:\/\/bing.com\" target=\"_blank\" rel=\"nofollow noopener\">[bing.com]<\/a>.<br><br>These are some of the most capable text generation models available today. They have various applications and use cases across different domains and industries. 
<a href=\"https:\/\/www.mckinsey.com\" target=\"_blank\" rel=\"nofollow noopener\">[www.mckinsey.com]<\/a> <a href=\"https:\/\/bing.com\" target=\"_blank\" rel=\"nofollow noopener\">[bing.com]<\/a><\/blockquote>\r\n\r\n<hr class=\"wp-block-separator has-alpha-channel-opacity\"\/>\r\n\r\n\r\n\r\n<h2 class=\"wp-block-heading\">Hugging Face Dataset Metrics<\/h2>\r\n\r\n\r\n\r\n<p>All the conversation prompts, responses, and metrics are available to download and explore on the Hugging Face dataset <a href=\"https:\/\/huggingface.co\/datasets\/reddgr\/talking-to-chatbots-chats\" target=\"_blank\" rel=\"noopener\" title=\"\">reddgr\/talking-to-chatbots-chats<\/a>:<\/p>\r\n\r\n\r\n\r\n<iframe\r\n  src=\"https:\/\/huggingface.co\/datasets\/reddgr\/talking-to-chatbots-chats\/embed\/viewer\/default\/train?f[conversation_id][value]=%2794780168-5f25-4d94-a97d-cd50b2e4867b%27\"\r\n  frameborder=\"0\"\r\n  width=\"100%\"\r\n  height=\"390px\"\r\n><\/iframe>\r\n\r\n\r\n\r\n<iframe\r\n  src=\"https:\/\/huggingface.co\/datasets\/reddgr\/talking-to-chatbots-unwrapped-chats\/embed\/viewer\/default\/train?f[conversation_id][value]=%2794780168-5f25-4d94-a97d-cd50b2e4867b%27&#038;sort[column]=turn&#038;sort[direction]=asc\"\r\n  frameborder=\"0\"\r\n  width=\"100%\"\r\n  height=\"460px\"\r\n><\/iframe>\r\n","protected":false},"excerpt":{"rendered":"<p>Loading data from reddgr\/talking-to-chatbots-chats\u2026 \ud83d\ude0e What are some examples of text-based generative AI models? Complete the list, starting with LLMs\u2026 \ud83e\udd16 Text-based generative AI models are algorithms that can create new and realistic text content based on a\u2026<\/p>\n<p class=\"read-more\"> <a class=\"more-link\" href=\"https:\/\/talkingtochatbots.com\/es\/bing\/technology-and-ai\/examples-of-text-based-generative-ai-models-including-llms\/\"> <span class=\"screen-reader-text\">Examples of text-based generative AI models, including LLMs<\/span> Read more \u00bb<\/a><\/p>","protected":false},"author":0,"featured_media":0,"parent":5678,"menu_order":0,"comment_status":"closed","ping_status":"closed","template":"","meta":{"_monsterinsights_skip_tracking":false,"_monsterinsights_sitenote_active":false,"_monsterinsights_sitenote_note":"","_monsterinsights_sitenote_category":0,"footnotes":""},"class_list":["post-5777","page","type-page","status-publish","hentry"],"aioseo_notices":[],"_links":{"self":[{"href":"https:\/\/talkingtochatbots.com\/es\/wp-json\/wp\/v2\/pages\/5777","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/talkingtochatbots.com\/es\/wp-json\/wp\/v2\/pages"}],"about":[{"href":"https:\/\/talkingtochatbots.com\/es\/wp-json\/wp\/v2\/types\/page"}],"replies":[{"embeddable":true,"href":"https:\/\/talkingtochatbots.com\/es\/wp-json\/wp\/v2\/comments?post=5777"}],"version-history":[{"count":3,"href":"https:\/\/talkingtochatbots.com\/es\/wp-json\/wp\/v2\/pages\/5777\/revisions"}],"predecessor-version":[{"id":6212,"href":"https:\/\/talkingtochatbots.com\/es\/wp-json\/wp\/v2\/pages\/5777\/revisions\/6212"}],"up":[{"embeddable":true,"href":"https:\/\/talkingtochatbots.com\/es\/wp-json\/wp\/v2\/pages\/5678"}],"wp:attachment":[{"href":"https:\/\/talkingtochatbots.com\/es\/wp-json\/wp\/v2\/media?parent=5777"}],"curies":[{"name":"gracias","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}