Welcome to Chenping Hou's Homepage
Visiting Student in Bigeye Lab, Tsinghua University, Supervised by Professor Changshui Zhang, 2008.
Visiting Scholar in QCIS, UTS, Supervised by Professor Dacheng Tao, 2013.
Visiting Scholar in LAMDA, Nanjing University, Supervised by Professor Zhi-Hua Zhou, 2016.
1. C. Hou and Z.-H. Zhou. One-Pass Learning with Incremental and Decremental Features. Preprint on arXiv.
We focus on the problem in which both features and instances evolve, and propose a one-pass learning method to solve it.
1. C. Hou, F. Nie, D. Yi and D. Tao. Discriminative embedded clustering: A general framework for grouping high dimensional data. Accepted by IEEE TNNLS, 2014.
We focus on the unsupervised problem of learning a subspace for clustering; traditionally, subspace learning and clustering have been employed sequentially. We propose a general framework that can analyze many methods in a unified view.
2. C. Hou, C. Zhang, Y. Wu, Y. Jiao. Stable local dimensionality reduction approaches. Pattern Recognition, Volume 42, Issue 9, September 2009, Pages 2054-2066.
We focus on the problem of enhancing the stability of local dimensionality reduction approaches. We propose a novel framework that adds global information to local methods.
3. C. Hou, J. Wang, Y. Wu, D. Yi. Local linear transformation embedding. Neurocomputing, Volume 72, Issues 10-12, June 2009, Pages 2368-2378.
We focus on the problem of alleviating the local sensitivity of LLE. We compute the local approximation weights in a tangent space.
4. C. Hou, F. Nie, C. Zhang, Y. Wu. Learning an orthogonal and smooth subspace for image classification. IEEE Signal Processing Letters, 16(4): 303-306, 2009.
We focus on the problem of subspace learning for image classification. We add a spatial smoothness regularizer and impose an orthogonality constraint when learning the subspace.
5. C. Hou, F. Nie, C. Zhang, Y. Wu. Learning a Subspace for Clustering via Pattern Shrinking. Information Processing & Management (IPM), 49(4): 871-883, 2013.
We focus on the problem of learning a subspace for clustering. We propose a novel strategy, named pattern shrinking, to maintain the nonlinear structure in linear subspace learning.
1. C. Hou, F. Nie, X. Li, D. Yi, Y. Wu. Joint Embedding Learning and Sparse Regression: A Framework for Unsupervised Feature Selection. IEEE Transactions on Cybernetics, 44(6): 793-804, 2014.
We focus on the problem of unsupervised feature selection. We propose a general framework that jointly performs embedding learning and sparse regression. It enables traditional dimensionality reduction methods, e.g., spectral embedding, to perform feature selection through added sparse constraints.
2. C. Hou, F. Nie, D. Yi, Y. Wu. Feature selection via joint embedding learning and sparse regression. Proceedings of the Twenty-Second International Joint Conference on Artificial Intelligence, vol. 2, pp. 1324-1329, 2011.
We focus on the problem of unsupervised feature selection. This is a short version of the paper "Joint Embedding Learning and Sparse Regression: A Framework for Unsupervised Feature Selection".
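As a rough illustration only (not the papers' exact algorithm), the joint idea can be sketched as: compute a spectral embedding from a neighborhood graph, fit an L2,1-regularized regression from the features to that embedding, and rank features by the row norms of the regression matrix. All function names and parameter values below are invented for this sketch.

```python
import numpy as np

def spectral_embedding(X, n_neighbors=5, dim=1):
    """Laplacian-eigenmap-style embedding built from a kNN similarity graph."""
    n = X.shape[0]
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)   # pairwise squared distances
    W = np.zeros((n, n))
    for i in range(n):
        idx = np.argsort(d2[i])[1:n_neighbors + 1]        # skip the point itself
        W[i, idx] = np.exp(-d2[i, idx] / (d2[i, idx].mean() + 1e-12))
    W = (W + W.T) / 2
    W += 1e-6                                             # weak coupling keeps the graph connected
    L = np.diag(W.sum(1)) - W                             # graph Laplacian
    _, vecs = np.linalg.eigh(L)
    return vecs[:, 1:dim + 1]                             # drop the trivial constant eigenvector

def l21_regression(X, Y, beta=0.01, n_iter=30):
    """min_W ||X W - Y||_F^2 + beta * ||W||_{2,1}, solved by iterative reweighting."""
    d = X.shape[1]
    D = np.eye(d)
    for _ in range(n_iter):
        W = np.linalg.solve(X.T @ X + beta * D, X.T @ Y)
        D = np.diag(1.0 / (2 * np.sqrt((W ** 2).sum(1)) + 1e-12))
    return W

def select_features(X, n_selected, dim=1, beta=0.01):
    """Rank features by the row norms of the sparse regression matrix."""
    X = (X - X.mean(0)) / (X.std(0) + 1e-12)              # standardize features
    Y = spectral_embedding(X, dim=dim)
    W = l21_regression(X, Y, beta=beta)
    scores = np.sqrt((W ** 2).sum(1))
    return np.sort(np.argsort(scores)[::-1][:n_selected])
```

On toy data where only the first two features carry the cluster structure, the top-ranked features come out as the informative ones: the L2,1 norm shrinks whole rows of W, so features that do not help reconstruct the embedding get small scores.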
1. C. Hou, F. Nie, D. Yi, Y. Wu. Efficient image classification via multiple rank regression. IEEE Transactions on Image Processing, 22(1): 340-352, 2013.
We focus on the problem of image data classification. We propose a novel multiple rank regression model that classifies image data directly, and we reveal the essence of the rank of the regression. It can be regarded as a general extension of the traditional linear regression method.
2. C. Hou, F. Nie, C. Zhang, D. Yi, Y. Wu. Multiple rank multi-linear SVM for matrix data classification. Pattern Recognition, 47(1): 454-469, 2014.
We focus on the problem of enabling linear SVM to manipulate matrix data directly. We present a general method that uses multiple rank multi-linear regression as the constraint of linear SVM.
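To give a feel for the matrix-data idea (this is a simplified sketch, not the papers' exact formulation), the rank-one special case fits y ≈ u'Xv + b directly on matrix inputs: the structured weight matrix W = uv' needs only p+q parameters instead of p*q, and the problem is solved by alternating least squares. Function names and the ridge term are my own choices for the sketch.

```python
import numpy as np

def bilinear_regression(Xs, y, n_iter=20, ridge=1e-3, seed=0):
    """Fit y ~ u @ X @ v + b for matrix inputs X by alternating least squares."""
    rng = np.random.default_rng(seed)
    n, p, q = len(Xs), Xs[0].shape[0], Xs[0].shape[1]
    u = rng.normal(size=p)
    v = rng.normal(size=q)
    b = 0.0
    for _ in range(n_iter):
        # Fix v: each sample collapses to the p-vector X @ v; solve for (u, b).
        F = np.stack([X @ v for X in Xs])
        Fa = np.hstack([F, np.ones((n, 1))])
        w = np.linalg.solve(Fa.T @ Fa + ridge * np.eye(p + 1), Fa.T @ y)
        u, b = w[:p], w[p]
        # Fix u: each sample collapses to the q-vector X.T @ u; solve for (v, b).
        G = np.stack([X.T @ u for X in Xs])
        Ga = np.hstack([G, np.ones((n, 1))])
        w = np.linalg.solve(Ga.T @ Ga + ridge * np.eye(q + 1), Ga.T @ y)
        v, b = w[:q], w[q]
    return u, v, b

def predict(Xs, u, v, b):
    return np.array([u @ X @ v + b for X in Xs])
```

Each alternating step is an ordinary ridge regression, so the fit never has to vectorize the matrix inputs; summing several such u_k v_k' terms raises the rank, which is the "multiple rank" extension.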
1. C. Hou, F. Nie, F. Wang, C. Zhang, Y. Wu. Semisupervised Learning Using Negative Labels. IEEE Transactions on Neural Networks, 22(3): 420-432, March 2011.
We focus on the problem of semi-supervised learning. We propose a new kind of label, i.e., the negative label, and demonstrate its effectiveness in semi-supervised learning.
2. C. Hou, C. Zhang, Y. Wu, F. Nie. Multiple view semi-supervised dimensionality reduction. Pattern Recognition, Volume 43, Issue 3, March 2010, Pages 720-730.
We focus on the problem of semi-supervised dimensionality reduction on multiple view data. We propose to use link information for multiple view dimensionality reduction.
Updated on: 2017-05-05 08:23