PUBLICATIONS

2019

SemiHD: Semi-Supervised Learning Using Hyperdimensional Computing


2019

CompHD: Efficient Hyperdimensional Computing Using Model Compression


2018

Particle shape accounts for instrumental discrepancy in ice core dust size distributions

  • Journal: Climate of the Past / European Geosciences Union
  • Work at the Niels Bohr Institute, University of Copenhagen
  • Group of Prof. Dorthe Dahl-Jensen
  • Contribution of the paper: resolves an instrumental discrepancy in ice core dust size distributions, advancing measurement techniques for ice and climate science
  • My contributions: statistical analysis of dust particle shapes and their effect on laser-light scattering measurements; experimental work

SUBMITTED PAPERS

2019

QuantHD: A Quantization Framework for FPGA Acceleration of Hyperdimensional Computing

  • Work at University of California, San Diego
  • Group of Prof. Tajana Rosing
  • We developed a new quantization framework for HD computing that achieves, on average, 34.1x and 4.1x better energy efficiency for training and testing respectively, and 8.2x and 13.4x faster computation in training and testing, while providing classification accuracy similar to low-cost state-of-the-art machine learning algorithms

2019

Bit-Serial HD: Approximate Hyperdimensional Computing in Machine Learning

  • Work at University of California, San Diego and EPFL
  • Groups of Prof. Tajana Rosing (UCSD) and Prof. Giovanni De Micheli (EPFL)
  • We developed a new HD computing acceleration framework that significantly reduces the power consumption of machine learning algorithms while maintaining nearly the same classification accuracy

2019

AdaptHD: Adaptive Efficient Training for Brain-Inspired Hyperdimensional Computing

  • Work at University of California, San Diego
  • Group of Prof. Tajana Rosing
  • AdaptHD introduces the notion of a learning rate into HD computing and proposes two adaptive training approaches: iteration-dependent and data-dependent. In the iteration-dependent approach, AdaptHD uses a large learning rate to speed up training during the first iterations and then adaptively reduces it depending on the slope of the error rate (see the sketch below).
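
  The iteration-dependent idea can be illustrated with a minimal sketch. This is not the published AdaptHD implementation: the dimensionality, synthetic data, cosine-similarity classifier, slope threshold, and decay factor below are placeholder assumptions, chosen only to show a learning rate that starts large and shrinks once the error-rate curve flattens.

    import numpy as np

    D = 1000                               # hypervector dimensionality (assumed)
    rng = np.random.default_rng(0)

    # Toy "encoded" training set: random hypervectors with two synthetic classes.
    X = rng.standard_normal((200, D))
    y = (X[:, 0] > 0).astype(int)

    classes = np.zeros((2, D))             # one class hypervector per label
    alpha = 1.0                            # large initial learning rate
    prev_err = 1.0

    for it in range(20):
        # Predict by cosine similarity against each class hypervector.
        c_norm = np.linalg.norm(classes, axis=1) + 1e-9
        x_norm = np.linalg.norm(X, axis=1, keepdims=True)
        sims = (X @ classes.T) / (x_norm * c_norm)
        pred = sims.argmax(axis=1)
        err = np.mean(pred != y)

        # Iteration-dependent schedule: shrink the learning rate when the
        # error rate stops improving (small slope between iterations).
        if prev_err - err < 0.01:
            alpha *= 0.5
        prev_err = err

        # Retraining step: for each misclassified sample, add it to the correct
        # class hypervector and subtract it from the wrongly chosen one,
        # scaled by the current learning rate.
        for xi, yi, pi in zip(X, y, pred):
            if yi != pi:
                classes[yi] += alpha * xi
                classes[pi] -= alpha * xi

        print(f"iteration {it}: error {err:.3f}, learning rate {alpha:.3f}")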

2019

LookHD: Acceleration of Hyperdimensional Computing Exploiting Computation Reuse

  • Work at University of California, San Diego
  • Group of Prof. Tajana Rosing
  • We invented LookHD, a method that compresses the matrix of an HD model into a single vector without significantly decreasing classification accuracy. This method is 2.2x faster and 4.1x more energy efficient than existing HD computing algorithms

2019

Global increase in atmospheric dust content over the last 300 years

  • Work at Niels Bohr Institute, University of Copenhagen
  • Group of Prof. Dorthe Dahl-Jensen & Prof. Paul Vallelonga
  • Contribution: discovery of a large increase in atmospheric dust content over the last 300 years, with significant implications for Earth's climate