    - [[https://www.youtube.com/watch?v=ySus5ZS0b94|OpenAI Embeddings and Vector Databases Crash Course]] {{fa>youtube}}
    - [[https://www.youtube.com/watch?v=N4nX_rTwKx4&t=318s|OpenAI GPT3 with a Databases - Crash Course]] {{fa>youtube}}
  - Literature and ideas
    - [[https://www.51cto.com/article/760672.html|From generalist to specialist: exploring Fine-tuning vs. Embedding]]
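A minimal sketch of the embeddings-plus-similarity-search idea covered in the material above, assuming the OpenAI Node SDK (v4) and the text-embedding-ada-002 model as an example; in a real setup the vectors would live in a vector database rather than an in-memory array.

<code javascript>
// Embed a few documents, then rank them against a query by cosine similarity.
// Assumes: npm i openai ; OPENAI_API_KEY set in the environment.
import OpenAI from "openai";

const openai = new OpenAI(); // reads OPENAI_API_KEY

async function embed(texts) {
  const res = await openai.embeddings.create({
    model: "text-embedding-ada-002", // example model
    input: texts,
  });
  return res.data.map((d) => d.embedding);
}

function cosine(a, b) {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

const docs = ["DokuWiki is a wiki engine.", "XGLM is a multilingual language model."];
const [queryVec, ...docVecs] = await embed(["What is DokuWiki?", ...docs]);

docs
  .map((text, i) => ({ text, score: cosine(queryVec, docVecs[i]) }))
  .sort((a, b) => b.score - a.score)
  .forEach((r) => console.log(r.score.toFixed(3), r.text));
</code>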
    
**Language Models**
  - [[https://huggingface.co/docs/huggingface.js/inference/README|inference api]] ([[https://huggingface.co/inference-api|Price]], scroll to the bottom)
  - [[https://huggingface.co/docs/transformers.js|Transformer.js]]
    - [[https://huggingface.co/Xenova|Xenova: ready-to-use models]]
  - [[https://huggingface.co/THUDM/codegeex2-6b|CodeGeex2]] / [[https://huggingface.co/bigcode/starcoder/tree/main|Starcoder(70GB)]] latest code models {{fa>book}}
    - [[https://www.bilibili.com/read/cv23898887/|Building a coding assistant with StarCoder]]
  - If a model has no ONNX weights you have to convert it yourself, since Transformers itself comes from Python (see {{fa>book}}[[https://huggingface.co/docs/transformers.js/custom_usage|onnx]]).
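A minimal local-inference sketch with Transformers.js, assuming the @xenova/transformers package and the Xenova/all-MiniLM-L6-v2 checkpoint as an example; any feature-extraction model from the Xenova page that ships ONNX weights should work the same way.

<code javascript>
// Run a Xenova ONNX model locally (Node or browser), no Python server needed.
// Assumes: npm i @xenova/transformers
import { pipeline } from "@xenova/transformers";

// Download (and cache) the model, then build a feature-extraction pipeline.
const extractor = await pipeline("feature-extraction", "Xenova/all-MiniLM-L6-v2");

// Mean-pooled, normalized sentence embedding (a Tensor with .dims and .data).
const output = await extractor("Building an AI assistant on top of wiki notes.", {
  pooling: "mean",
  normalize: true,
});

console.log(output.dims);                          // e.g. [1, 384]
console.log(Array.from(output.data).slice(0, 5));  // first few embedding values
</code>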
  
===== Language Models =====

Feature Extraction, Transformer, zh/en/de:
  * [[https://huggingface.co/setu4993/smaller-LaBSE/tree/main|smaller-LaBSE]]
  * [[https://huggingface.co/setu4993/LEALLA-large/tree/main|LEALLA-large]] / [[https://huggingface.co/setu4993/LEALLA-base/tree/main|LEALLA-base]]
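A minimal sketch of getting multilingual sentence embeddings from these models through the hosted Inference API, assuming the @huggingface/inference client and an HF_TOKEN; whether smaller-LaBSE is served on the free tier is an assumption.

<code javascript>
// Get embeddings for zh/en/de text from the hosted Inference API.
// Assumes: npm i @huggingface/inference ; HF_TOKEN set in the environment.
import { HfInference } from "@huggingface/inference";

const hf = new HfInference(process.env.HF_TOKEN);

const embedding = await hf.featureExtraction({
  model: "setu4993/smaller-LaBSE",
  inputs: "维基笔记的智能助手", // multilingual input works (zh/en/de)
});

// Depending on the model config this is a flat vector or token-level vectors.
console.log(Array.isArray(embedding), embedding.length);
</code>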

Question Answering
  * [[https://huggingface.co/IProject-10/xlm-roberta-base-finetuned-squad2/tree/main|XLM-RoBERTa (Facebook) fine-tuned on SQuAD2]]

Text generation
  * [[https://huggingface.co/facebook/xglm-564M/tree/main|xglm-564M]] (the 4.5B-parameter version of FB's XGLM needs a 10 GB file, which is too large)
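A minimal sketch of calling the QA and text-generation checkpoints above through the hosted Inference API, again assuming @huggingface/inference and HF_TOKEN; availability of these exact models on the free tier is an assumption.

<code javascript>
// Try the question-answering and text-generation models via the Inference API.
// Assumes: npm i @huggingface/inference ; HF_TOKEN set in the environment.
import { HfInference } from "@huggingface/inference";

const hf = new HfInference(process.env.HF_TOKEN);

// Extractive QA (multilingual XLM-RoBERTa fine-tuned on SQuAD2).
const qa = await hf.questionAnswering({
  model: "IProject-10/xlm-roberta-base-finetuned-squad2",
  inputs: {
    question: "What engine drives this wiki?",
    context: "This site is driven by DokuWiki and stores notes about AI assistance.",
  },
});
console.log(qa.answer, qa.score);

// Multilingual text generation with the small XGLM checkpoint.
const gen = await hf.textGeneration({
  model: "facebook/xglm-564M",
  inputs: "An AI assistant for a personal wiki should",
  parameters: { max_new_tokens: 40 },
});
console.log(gen.generated_text);
</code>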

**Langchain**
  * [[https://js.langchain.com/docs/get_started/installation|js.langchain]] {{fa>book}}
    * [[https://js.langchain.com/docs/api/chains/classes/RetrievalQAChain|RetrievalQAChain]]
    * [[https://js.langchain.com/docs/modules/model_io/models/llms/integrations/huggingface_inference|llms hf-inference]] / [[https://js.langchain.com/docs/modules/data_connection/text_embedding/integrations/hugging_face_inference|embedding inference]]
  * [[https://js.langchain.com/docs/modules/data_connection/vectorstores/integrations/mongodb_atlas|MongoDB]]
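A sketch of how these pieces fit together: HuggingFace Inference embeddings feed a MongoDB Atlas vector store, and RetrievalQAChain wraps an Inference-API LLM around the retriever. Import paths follow the mid-2023 js.langchain docs linked above and may differ in newer versions; the database, collection, index name and LLM model are placeholders.

<code javascript>
// Retrieval QA over a MongoDB Atlas vector index with HF Inference models.
// Assumes: npm i langchain @huggingface/inference mongodb ;
// env vars MONGODB_ATLAS_URI and HF_TOKEN; an Atlas vector search index
// named "default" already exists on the collection (placeholder names).
import { MongoClient } from "mongodb";
import { HuggingFaceInference } from "langchain/llms/hf";
import { HuggingFaceInferenceEmbeddings } from "langchain/embeddings/hf";
import { MongoDBAtlasVectorSearch } from "langchain/vectorstores/mongodb_atlas";
import { RetrievalQAChain } from "langchain/chains";

const client = new MongoClient(process.env.MONGODB_ATLAS_URI);
await client.connect();
const collection = client.db("wiki").collection("docs"); // placeholder names

// Embeddings via the hosted Inference API.
const embeddings = new HuggingFaceInferenceEmbeddings({
  apiKey: process.env.HF_TOKEN,
});

// Vector store backed by the Atlas collection and its search index.
const vectorStore = new MongoDBAtlasVectorSearch(embeddings, {
  collection,
  indexName: "default",   // Atlas vector search index (must exist)
  textKey: "text",
  embeddingKey: "embedding",
});

// LLM via the hosted Inference API (example model).
const llm = new HuggingFaceInference({
  model: "google/flan-t5-xxl",
  apiKey: process.env.HF_TOKEN,
});

// Retrieval QA: fetch relevant chunks, then let the LLM answer from them.
const chain = RetrievalQAChain.fromLLM(llm, vectorStore.asRetriever());
const res = await chain.call({ query: "How do I convert a model to ONNX?" });
console.log(res.text);

await client.close();
</code>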
===== Task Description =====
  