Yuhao Yang, Software Engineer at Intel


Yuhao Yang is currently a software engineer at Intel, focusing on implementation, consulting, and tuning advice on the Hadoop ecosystem for industry partners. His area of focus is distributed machine learning, especially large-scale analytical applications and infrastructure on Spark. He is also an active contributor to Spark MLlib (50+ patches): he delivered the implementations of online LDA, QR decomposition, and several feature-engineering transformers, and has improved a number of important algorithms.


Embrace Sparsity At Web Scale: Apache Spark MLlib Algorithms Optimization For Sparse Data

From purchase history to movie ratings, data sparsity has always been one of the primary characteristics of big data. Powerful as Apache Spark is at parallel processing of partitioned data, many of the algorithms…
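To make the sparsity point concrete, here is a minimal sketch of the (size, indices, values) layout that Spark MLlib's `SparseVector` uses, written in plain Python with no Spark dependency; the class and its merge-based dot product are illustrative assumptions, not MLlib's actual implementation.

```python
# Hypothetical sketch: a sparse vector stored as (size, indices, values),
# mirroring the layout of Spark MLlib's SparseVector. Not MLlib code.

class SparseVector:
    def __init__(self, size, indices, values):
        self.size = size        # logical length of the vector
        self.indices = indices  # sorted positions of non-zero entries
        self.values = values    # the non-zero entries themselves

    def dot(self, other):
        """Dot product of two sparse vectors via a sorted-index merge,
        touching only the non-zero entries of each operand."""
        i = j = 0
        total = 0.0
        while i < len(self.indices) and j < len(other.indices):
            if self.indices[i] == other.indices[j]:
                total += self.values[i] * other.values[j]
                i += 1
                j += 1
            elif self.indices[i] < other.indices[j]:
                i += 1
            else:
                j += 1
        return total

# A length-1000 vector with three non-zeros stores 3 values + 3 indices
# instead of 1000 floats -- the storage saving sparsity-aware code exploits.
a = SparseVector(1000, [0, 4, 999], [1.0, 2.0, 3.0])
b = SparseVector(1000, [4, 999], [5.0, -1.0])
print(a.dot(b))  # 2.0*5.0 + 3.0*(-1.0) = 7.0
```

The merge walks both index lists once, so the dot product costs time proportional to the number of non-zeros rather than the vector length, which is the kind of optimization the talk title refers to.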