- [[https://www.youtube.com/watch?v=ySus5ZS0b94|OpenAI Embeddings and Vector Databases Crash Course]] {{fa>youtube}}
- [[https://www.youtube.com/watch?v=N4nX_rTwKx4&t=318s|OpenAI GPT3 with a Databases - Crash Course]] {{fa>youtube}}
- Literature and ideas
  - [[https://www.51cto.com/article/760672.html|From generalist to specialist: exploring fine-tuning and embeddings]]

**Language models**
- [[https://huggingface.co/docs/huggingface.js/inference/README|inference api]] ([[https://huggingface.co/inference-api|Price]], scroll to the bottom of the page); a remote usage sketch follows this list
- [[https://huggingface.co/docs/transformers.js|Transformer.js]]
- [[https://huggingface.co/Xenova|Xenova, models that can be used directly]]
- [[https://huggingface.co/THUDM/codegeex2-6b|CodeGeex2]] / [[https://huggingface.co/bigcode/starcoder/tree/main|Starcoder (70GB)]], latest code models {{fa>book}}
- [[https://www.bilibili.com/read/cv23898887/|Creating a coding assistant with StarCoder]]
- [[https://tianchi.aliyun.com/course/956?spm=a2c22.28136470.0.0.57993f2czRiSez&from=search-list|Natural language processing with transformers]]
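A minimal sketch of calling the hosted Inference API from TypeScript with ''@huggingface/inference'' (the package behind the huggingface.js link above). The token variable and the model ids (''bigcode/starcoder'', ''sentence-transformers/all-MiniLM-L6-v2'') are illustrative assumptions; check the Price page for quota and billing.

<code typescript>
import { HfInference } from '@huggingface/inference';

// Assumption: an access token is stored in the HF_TOKEN environment variable.
const hf = new HfInference(process.env.HF_TOKEN);

// Remote text generation, e.g. against a hosted code model.
const generated = await hf.textGeneration({
  model: 'bigcode/starcoder',               // placeholder model id
  inputs: 'function fibonacci(n) {',
  parameters: { max_new_tokens: 64 },
});
console.log(generated.generated_text);

// Remote sentence embedding, usable later for vector search.
const embedding = await hf.featureExtraction({
  model: 'sentence-transformers/all-MiniLM-L6-v2',  // placeholder model id
  inputs: 'OpenAI embeddings and vector databases',
});
console.log(Array.isArray(embedding) ? embedding.length : embedding);
</code>
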
Searching for models that support Transformers
- If a model has no ONNX version, you need to convert it yourself, because Transformers itself comes from Python (see {{fa>book}}[[https://huggingface.co/docs/transformers.js/custom_usage|onnx]]); a local-inference sketch follows this note.
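A minimal local-inference sketch with ''@xenova/transformers'', assuming a model that already ships ONNX weights (anything under the Xenova namespace linked above); other checkpoints would first have to be converted as described in the linked onnx page. Model id and example text are illustrative.

<code typescript>
import { pipeline } from '@xenova/transformers';

// Loads directly because this repo already contains ONNX weights;
// a plain PyTorch checkpoint would first need an ONNX export.
const classifier = await pipeline(
  'sentiment-analysis',
  'Xenova/distilbert-base-uncased-finetuned-sst-2-english',
);

const result = await classifier('Transformers.js makes in-browser inference straightforward.');
console.log(result);  // e.g. [{ label: 'POSITIVE', score: 0.99 }]
</code>
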
===== Language models =====

Feature Extraction, Transformer, zh/en/de (similarity sketch after this list):
* [[https://huggingface.co/setu4993/smaller-LaBSE/tree/main|smaller-LaBSE]]
* [[https://huggingface.co/setu4993/LEALLA-large/tree/main|LEALLA-large]] / [[https://huggingface.co/setu4993/LEALLA-base/tree/main|LEALLA-base]]
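A sketch of computing multilingual sentence embeddings in Transformers.js and comparing them. ''Xenova/LaBSE'' is an assumed ONNX export of the LaBSE family listed above; if only the original PyTorch weights are available (e.g. smaller-LaBSE or LEALLA), they would need to be converted first.

<code typescript>
import { pipeline } from '@xenova/transformers';

// Assumed ONNX export of a LaBSE-style multilingual sentence encoder.
const extractor = await pipeline('feature-extraction', 'Xenova/LaBSE');

// Mean-pooled, L2-normalized sentence vectors for a zh/en pair.
const zh = await extractor('机器学习', { pooling: 'mean', normalize: true });
const en = await extractor('machine learning', { pooling: 'mean', normalize: true });

// Vectors are normalized, so the dot product equals the cosine similarity.
const a = zh.data as Float32Array;
const b = en.data as Float32Array;
let similarity = 0;
for (let i = 0; i < a.length; i++) similarity += a[i] * b[i];
console.log(similarity);  // high for translations of the same sentence
</code>
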
Question Answering
* [[https://huggingface.co/IProject-10/xlm-roberta-base-finetuned-squad2/tree/main|Model trained by Facebook (XLM-RoBERTa base fine-tuned on SQuAD2)]]

Text generation
* [[https://huggingface.co/facebook/xglm-564M/tree/main|xglm-564M]] (the 4.5B-parameter version of the same FB model needs a 10 GB file, too large)
**Langchain** (retrieval QA sketch after this list)
* [[https://js.langchain.com/docs/get_started/installation|js.langchain]] {{fa>book}}
* [[https://js.langchain.com/docs/api/chains/classes/RetrievalQAChain|RetrievalQAChain]]
* [[https://js.langchain.com/docs/modules/model_io/models/llms/integrations/huggingface_inference|llms hf-inference]] / [[https://js.langchain.com/docs/modules/data_connection/text_embedding/integrations/hugging_face_inference|embedding inference]]
* [[https://js.langchain.com/docs/modules/data_connection/vectorstores/integrations/mongodb_atlas|MongoDB]]
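A sketch wiring the pieces above together: Hugging Face Inference for both the LLM and the embeddings, MongoDB Atlas as the vector store, and RetrievalQAChain on top. Connection string, database/collection/index names and the model id are placeholder assumptions; the Atlas search index has to be created separately as described in the linked MongoDB page.

<code typescript>
import { MongoClient } from 'mongodb';
import { HuggingFaceInference } from 'langchain/llms/hf';
import { HuggingFaceInferenceEmbeddings } from 'langchain/embeddings/hf';
import { MongoDBAtlasVectorSearch } from 'langchain/vectorstores/mongodb_atlas';
import { RetrievalQAChain } from 'langchain/chains';

// Placeholder connection and naming; adjust to the actual Atlas setup.
const client = new MongoClient(process.env.MONGODB_ATLAS_URI ?? '');
const collection = client.db('notes').collection('vectors');

// Embeddings and LLM both go through the HF Inference API (token from env).
const embeddings = new HuggingFaceInferenceEmbeddings({
  apiKey: process.env.HUGGINGFACEHUB_API_KEY,
});
const llm = new HuggingFaceInference({
  model: 'bigscience/bloom-560m',          // placeholder model id
  apiKey: process.env.HUGGINGFACEHUB_API_KEY,
});

// Vector store backed by a pre-created Atlas Search index.
const vectorStore = new MongoDBAtlasVectorSearch(embeddings, {
  collection,
  indexName: 'default',
  textKey: 'text',
  embeddingKey: 'embedding',
});

// Retrieval-augmented QA over whatever documents were stored earlier.
const chain = RetrievalQAChain.fromLLM(llm, vectorStore.asRetriever());
const res = await chain.call({ query: 'What do the notes say about embeddings?' });
console.log(res.text);

await client.close();
</code>
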
===== Task description =====