
Graph Based Prompting

  • Related Project: Private
  • Category: Paper Review
  • Date: 2023-08-30

Graph-Based Prompting and Reasoning with Language Models
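
Below is a minimal, illustrative sketch of the graph-of-thoughts idea surveyed here [2], showing how it generalizes chain-of-thought [8] and tree-of-thoughts [9]: intermediate thoughts become nodes in a DAG that can be expanded, scored, pruned, and, unlike a tree, merged back together. The `llm` function, prompts, and scoring below are hypothetical placeholders, not the paper's actual implementation.

```python
# Illustrative Graph-of-Thoughts-style sketch (not the paper's code).
# Assumptions: `llm` stands in for any text-completion API; prompts and
# the scoring heuristic are toy placeholders.

from dataclasses import dataclass, field


@dataclass
class Thought:
    text: str                                     # one reasoning step
    parents: list = field(default_factory=list)   # incoming DAG edges
    score: float = 0.0                            # estimate used for pruning


def llm(prompt: str) -> str:
    """Hypothetical model call; swap in a real completion API here."""
    return f"(model output for: {prompt[:40]}...)"


def generate(thought: Thought, k: int = 2) -> list:
    """Expand one node into k successors (chain-of-thought is k=1, no merging)."""
    return [Thought(llm(f"Refine this step:\n{thought.text}"), parents=[thought])
            for _ in range(k)]


def aggregate(thoughts: list) -> Thought:
    """Merge several branches into one node: the in-degree > 1 edge pattern
    that distinguishes a graph of thoughts from a tree of thoughts [9]."""
    joined = "\n".join(t.text for t in thoughts)
    return Thought(llm(f"Combine these partial solutions:\n{joined}"),
                   parents=list(thoughts))


def score(thought: Thought) -> float:
    """Toy self-evaluation; real systems use task-specific scoring."""
    return float(len(thought.text))


# One round of graph reasoning: expand, score and prune, aggregate survivors.
root = Thought("Problem: sort the list [3, 1, 2] and explain each step.")
frontier = generate(root, k=3)
for t in frontier:
    t.score = score(t)
best = sorted(frontier, key=lambda t: t.score, reverse=True)[:2]
final = aggregate(best)
print(final.text)
```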


References

  • [1] Yao, Yao, Zuchao Li, and Hai Zhao. “Beyond Chain-of-Thought, Effective Graph-of-Thought Reasoning in Large Language Models.” arXiv preprint arXiv:2305.16582 (2023).
  • [2] Besta, Maciej, et al. “Graph of Thoughts: Solving Elaborate Problems with Large Language Models.” arXiv preprint arXiv:2308.09687 (2023).
  • [3] Zhang, Zhuosheng, et al. “Multimodal chain-of-thought reasoning in language models.” arXiv preprint arXiv:2302.00923 (2023).
  • [4] Dosovitskiy, Alexey, et al. “An image is worth 16x16 words: Transformers for image recognition at scale.” arXiv preprint arXiv:2010.11929 (2020).
  • [5] Raffel, Colin, et al. “Exploring the limits of transfer learning with a unified text-to-text transformer.” The Journal of Machine Learning Research 21.1 (2020): 5485-5551.
  • [6] Carion, Nicolas, et al. “End-to-end object detection with transformers.” European conference on computer vision. Cham: Springer International Publishing, 2020.
  • [7] Brown, Tom, et al. “Language models are few-shot learners.” Advances in neural information processing systems 33 (2020): 1877-1901.
  • [8] Wei, Jason, et al. “Chain-of-thought prompting elicits reasoning in large language models.” Advances in Neural Information Processing Systems 35 (2022): 24824-24837.
  • [9] Yao, Shunyu, et al. “Tree of thoughts: Deliberate problem solving with large language models.” arXiv preprint arXiv:2305.10601 (2023).
  • [10] Wang, Xuezhi, et al. “Self-consistency improves chain of thought reasoning in language models.” arXiv preprint arXiv:2203.11171 (2022).
  • [11] Vaswani, Ashish, et al. “Attention is all you need.” Advances in neural information processing systems 30 (2017).
  • [12] Devlin, Jacob, et al. “BERT: Pre-training of deep bidirectional transformers for language understanding.” arXiv preprint arXiv:1810.04805 (2018).
  • [13] Kipf, Thomas N., and Max Welling. “Semi-supervised classification with graph convolutional networks.” arXiv preprint arXiv:1609.02907 (2016).
  • [14] Veličković, Petar, et al. “Graph attention networks.” arXiv preprint arXiv:1710.10903 (2017).