Sitemap

A list of all the posts and pages found on the site. For you robots out there, an XML version is available for digesting as well.

Pages

Posts

Blog Post number 4

less than 1 minute read

Published:

This is a sample blog post. Lorem ipsum I can’t remember the rest of lorem ipsum and don’t have an internet connection right now. Testing testing testing this blog post. Blog posts are cool.

Blog Post number 3

less than 1 minute read

Published:

This is a sample blog post. Lorem ipsum I can’t remember the rest of lorem ipsum and don’t have an internet connection right now. Testing testing testing this blog post. Blog posts are cool.

Blog Post number 2

less than 1 minute read

Published:

This is a sample blog post. Lorem ipsum I can’t remember the rest of lorem ipsum and don’t have an internet connection right now. Testing testing testing this blog post. Blog posts are cool.

Portfolio

Publications

A hybrid-scales graph contrastive learning framework for discovering regularities in traditional Chinese medicine formula

Published in International Conference on Bioinformatics and Biomedicine (BIBM 2021), 2021

Several machine learning methods, such as topic models, auto-encoders, and GNNs, have been proposed for discovering regularities in Traditional Chinese Medicine (TCM). However, they are often limited by specific data challenges in TCM formulae (e.g., complex relations with rich TCM knowledge, sparsity and ambiguity, expensive data labeling, etc.). To address these challenges, we first establish a TCM Attributed Heterogeneous Information Network (TAHIN) for modeling massive formulae, which can assemble various types of additional information and capture their relations. We further propose a novel hybrid-scales graph contrastive learning framework to learn high-quality node representations in a fully unsupervised manner, which benefits various regularity-discovery tasks such as herb classification and herb similarity search. Extensive experiments demonstrate the effectiveness and interpretability of our method.

Recommended citation: Yingpei Wu, **Zecheng Yin**, ... , Yanchun Zhang. A hybrid-scales graph contrastive learning framework for discovering regularities in traditional Chinese medicine formula, BIBM'21
Download Paper

Heterogeneous Graph Contrastive Learning for Traditional Chinese Medicine Prescription Generation

Published in International Conference on Health Information Science (HIS 2022), 2022

Traditional Chinese Medicine (TCM) is a highly empirical, subjective, and practical discipline. Generating an appropriate prescription is one of the most crucial components in building intelligent diagnosis systems that provide clinical decision support to physicians. While various machine learning models for prescription generation have been created, they suffer from specific limitations (e.g., data complexity and semantic ambiguity, lack of syndrome differentiation thinking, etc.). To handle these limitations, we propose a novel Heterogeneous Graph Contrastive Learning (HGCL) based model that conducts prescription generation with the idea of syndrome differentiation and treatment. Specifically, we first model TCM clinical prescriptions as a TCM Heterogeneous Information Network (THIN), and then explore node- and semantic-level contrastive learning on THIN, so as to enhance the quality of node representations for several downstream tasks such as node classification and prescription generation. We conduct extensive experiments on three real TCM clinical datasets, demonstrating significant improvement over state-of-the-art methods, some of which are fully unsupervised.

Recommended citation: **Zecheng Yin**, Yingpei Wu, and Yanchun Zhang. HGCL: Heterogeneous Graph Contrastive Learning for Traditional Chinese Medicine Prescription Generation, HIS'22
Download Paper | Download Slides


Navigation with VLM framework: Go to Any Language

Published in IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS'25, Under Review), 2024

Navigating towards fully open language goals and exploring open scenes in a manner akin to human exploration have always posed significant challenges. Recently, Vision Large Language Models (VLMs) have demonstrated remarkable capabilities in reasoning with both language and visual data. While many works have focused on leveraging VLMs for navigation in open scenes and with open vocabularies, these efforts often fall short of fully utilizing the potential of VLMs or require substantial computational resources. We introduce Navigation with VLM (NavVLM), a framework that harnesses equipment-level VLMs to enable agents to navigate towards any language goal, specific or non-specific, in open scenes, emulating human exploration behaviors without any prior training. The agent leverages the VLM as its cognitive core to perceive environmental information based on any language goal, and it continually provides exploration guidance during navigation until the agent reaches the target location or area. Our framework not only achieves state-of-the-art performance in Success Rate (SR) and Success weighted by Path Length (SPL) in traditional specific-goal settings but also extends navigation capabilities to any open-set language goal. We evaluate NavVLM in richly detailed environments from the Matterport 3D (MP3D), Habitat Matterport 3D (HM3D), and Gibson datasets within the Habitat simulator. With the power of VLMs, navigation has entered a new era.

Recommended citation: **Zecheng Yin**, Chonghao Cheng, Yinghong Liao, Zhihao Yuan, Shuguang Cui, Zhen Li. Navigation with VLM framework: Go to Any Language, IROS'25 (Under Review)
Download Paper | Download Slides

Talks

Talk HIS 2022

Published:

This is a description of your talk, which is a markdown file that can be markdown-ified like any other post. Yay markdown!

Teaching

Teaching experience 1

Undergraduate course, University 1, Department, 2014

This is a description of a teaching experience. You can use markdown like any other post.

Teaching experience 2

Workshop, University 1, Department, 2015

This is a description of a teaching experience. You can use markdown like any other post.