Self-Harmonized Chain of Thought

Title:
Self-Harmonized Chain of Thought
Journal Title:
Nations of the Americas Chapter of the Association for Computational Linguistics (NAACL)
Keywords:
Publication Date:
29 April 2025
Citation:
Ziqi Jin and Wei Lu. 2025. Self-Harmonized Chain of Thought. In Proceedings of the 2025 Conference of the Nations of the Americas Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 1: Long Papers), pages 1153–1174, Albuquerque, New Mexico. Association for Computational Linguistics.
Abstract:
Chain-of-thought (CoT) prompting has demonstrated the capacity of large language models to perform complex reasoning through intermediate steps. While effective, current CoT methods face challenges: Zero-shot-CoT can lead to reasoning errors, and Few-shot-CoT requires labor-intensive manual demonstrations. Auto-CoT attempts to address these issues by automatically generating diverse demonstrations, but this diversity can lead to inconsistent reasoning patterns. We propose ECHO (Self-Harmonized Chain of Thought), a novel method that unifies diverse solution paths into a consistent and effective reasoning pattern. ECHO employs an iterative process to refine and harmonize automatically generated demonstrations, mitigating the limitations of existing approaches. Our comprehensive experiments across arithmetic, commonsense, and symbolic reasoning tasks demonstrate that ECHO outperforms Auto-CoT by an average of 2.8%. These findings suggest that ECHO represents a significant step towards more robust and generalizable automated reasoning in large language models.
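The abstract outlines ECHO's core idea: start from automatically generated Zero-shot-CoT demonstrations, then iteratively regenerate each one using the others as in-context examples so the reasoning patterns converge. The sketch below is an illustrative approximation of that loop, not the authors' implementation; `toy_llm`, `echo_harmonize`, and the prompt wording are all assumptions for demonstration purposes.

```python
import random

def toy_llm(prompt):
    # Placeholder for a real LLM call; echoes a canned step-by-step answer.
    question = prompt.rsplit("Q: ", 1)[-1].split("\n")[0]
    return f"Step 1: restate '{question}'. Step 2: reason. The answer is X."

def zero_shot_rationale(question, llm):
    # Zero-shot-CoT: elicit a rationale with "Let's think step by step."
    return llm(f"Q: {question}\nA: Let's think step by step.")

def echo_harmonize(questions, llm, iterations=4, seed=0):
    """Illustrative harmonization loop: repeatedly regenerate one
    demonstration's rationale with the remaining demonstrations as
    in-context examples, so reasoning patterns gradually align."""
    rng = random.Random(seed)
    demos = [(q, zero_shot_rationale(q, llm)) for q in questions]
    for _ in range(iterations):
        i = rng.randrange(len(demos))  # pick one demo to refresh
        others = [d for j, d in enumerate(demos) if j != i]
        context = "\n\n".join(f"Q: {q}\nA: {r}" for q, r in others)
        q, _ = demos[i]
        demos[i] = (q, llm(f"{context}\n\nQ: {q}\nA: Let's think step by step."))
    return demos

demos = echo_harmonize(["2 + 3 = ?", "5 * 4 = ?"], toy_llm)
```

With a real model, the regeneration step is where unification happens: each refreshed rationale is conditioned on the current style of the other demonstrations, pulling divergent solution paths toward a shared pattern.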
License type:
Attribution 4.0 International (CC BY 4.0)
Funding Info:
This research/project is supported by the National Research Foundation - AI Singapore Programme
Grant Reference no.: AISG2-TC-2023-013

This research/project is supported by the Ministry of Education - Academic Research Fund (AcRF) Tier 2 Programme
Grant Reference no.: MOET2EP20122-0011

This research/project is supported by the Ministry of Education - Academic Research Fund (AcRF) Tier 3 Programme
Grant Reference no.: MOET32020-0004
Description:
ACL materials are Copyright © 1963–2025 ACL; other materials are copyrighted by their respective copyright holders. Materials prior to 2016 are licensed under the Creative Commons Attribution-NonCommercial-ShareAlike 3.0 International License. Permission is granted to make copies for the purposes of teaching and research. Materials published in or after 2016 are licensed under a Creative Commons Attribution 4.0 International License.
ISSN:
Nil