Preprint

1. C. Hou, F. Nie, T. Tao and D. Yi. Multi-view Unsupervised Feature Selection with Adaptive Similarity and View Weights. Accepted by TKDE.

2. C. Hou, Y. Jiao, F. Nie, T. Luo and Z. Zhou. Two-Dimensional Feature Selection by Sparse Matrix Regression, IEEE TIP, Accepted.

3. W. Zhuge, C. Hou, Y. Jiao, J. Yue, H. Tao and D. Yi. Robust Auto-weighted Multi-view Subspace Clustering with Common Subspace Representation Matrix, PLOS ONE, Accepted.

4. H. Tao, C. Hou, F. Nie, J. Zhu and D. Yi. Scalable Multi-View Semi-Supervised Classification via Adaptive Regression, IEEE TIP.

5. C. Hou and Z.-H. Zhou. One-Pass Learning with Incremental and Decremental Features. arXiv preprint. (OPID.pdf)

2016

1. T. Luo, C. Hou, D. Yi, J. Zhang: Discriminative orthogonal elastic preserving projections for classification. Neurocomputing 179: 54-68 (2016). (DOEPP.pdf)

2. H. Tao, C. Hou, F. Nie, Y. Jiao and D. Yi. Effective Discriminative Feature Selection with Non-trivial Solutions. IEEE Trans. Neural Netw. Learning Syst. 27(4): 796-808 (2016). (TNNLS-DFS.pdf)

We propose a novel supervised feature selection approach based on LDA. More importantly, we develop a new solving method that yields non-trivial solutions.

3. C. Hou, F. Nie, D. Tao: Discriminative Vanishing Component Analysis. AAAI 2016: 1666-1672. (DVCA.pdf)

We analyze VCA from a kernel viewpoint and propose DVCA, an extension of VCA that incorporates discriminative information.
4. D. Li, C. Hou, D. Yi. Multi-Bernoulli smoother for multi-target tracking. Aerospace Science and Technology, 48(1): 234-245, 2016. (CBMeMBer.pdf)
5. G. Lan, C. Hou and D. Yi. Robust Feature Selection via Simultaneous Capped l2-Norm and l2,1-Norm Minimization. ICBDA, 2016. (SCM.pdf)

6. Z. Zhao, Y. Jiao, C. Hou and Y. Wu. Learning Partial Differential Equations for Saliency Detection. ICBDA, 2016. (LPDE.pdf)

2015

1.   C. Hou, F. Nie, D. Yi and D. Tao. Discriminative embedded clustering: A general framework for grouping high-dimensional data. IEEE Trans. Neural Netw. Learning Syst. 26(6): 1287-1299 (2015). (TNNLS-DEC.pdf) (Matlab Code.zip)

We focus on the unsupervised problem of learning a subspace for clustering; traditionally, subspace learning and clustering have been applied sequentially. We propose a general framework that analyzes many existing methods in a unified view.

2. H. Tao, C. Hou, F. Nie, Y. Jiao, D. Yi: Effective Discriminative Feature Selection with Nontrivial Solutions. CoRR abs/1504.05408 (2015)

2014

1.   C. Hou, F. Nie, X. Li, D. Yi, Y. Wu. Joint Embedding Learning and Sparse Regression: A Framework for Unsupervised Feature Selection. IEEE Transactions on Cybernetics, 44(6): 793-804, 2014. (JELSR.pdf) (Matlab Code.zip)

We focus on the problem of unsupervised feature selection. We propose a general framework that jointly performs embedding learning and sparse regression. It enables traditional dimensionality reduction methods, e.g., spectral embedding, to perform feature selection by adding a sparse constraint; a schematic objective is sketched below.
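As an illustration only (the notation below is assumed for this summary, not copied from the paper), a joint objective of this kind couples spectral embedding with an l2,1-regularized regression:

$$\min_{W,\;Y:\,Y^{\top}Y=I}\ \operatorname{tr}\!\left(Y^{\top} L\, Y\right) \;+\; \beta\left(\left\|X^{\top}W - Y\right\|_{F}^{2} + \alpha\left\|W\right\|_{2,1}\right),$$

where $X \in \mathbb{R}^{d\times n}$ stacks the samples, $L$ is the graph Laplacian used by spectral embedding, $Y \in \mathbb{R}^{n\times c}$ is the low-dimensional embedding, and $W \in \mathbb{R}^{d\times c}$ is the regression matrix. The $\ell_{2,1}$ penalty drives whole rows of $W$ to zero, so features can be ranked and selected by the row norms $\|w^{i}\|_{2}$.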

2.   C. Hou, F. Nie, C. Zhang, D. Yi, Y. Wu. Multiple rank multi-linear SVM for matrix data classification. Pattern Recognition 47(1): 454-469, 2014. (MRMLSVM.pdf) (mrmlsvm.m)

We focus on the problem of enabling linear SVM to handle matrix data directly. We present a general method that imposes a multiple-rank multi-linear structure on the weights of the linear SVM.

3.   C. Hou, F. Nie, H. Wang, D. Yi, C. Zhang. Learning high-dimensional correspondence via manifold learning and local approximation. Neural Computing and Applications 24(7-8): 1555-1568, 2014. (LAMVU.pdf)

4.   H. Tao, C. Hou, D. Yi. Multiple-View Spectral Embedded Clustering Using a Co-training Approach. Computer Engineering and Networking, Lecture Notes in Electrical Engineering Volume 277, 2014, pp. 979-987. (CoSEC.pdf)

5.   L. Li, Z. Zhao, C. Hou, Y. Wu. Semi-supervised Learning Using Nonnegative Matrix Factorization and Harmonic Functions. Computer Engineering and Networking, Lecture Notes in Electrical Engineering Volume 277, 2014, pp. 321-328. (NMF-HF.pdf)

2013

1.   C. Hou, F. Nie, D. Yi, Y. Wu. Efficient image classification via multiple rank regression. IEEE Transactions on Image Processing, 22(1): 340-352, 2013. (MRR.pdf) (MatlabCode.zip)

We focus on the problem of image classification. We propose a novel multiple-rank regression model to classify image data directly, and we reveal the role of the rank of the regression: the model can be regarded as a general extension of the traditional linear regression method, as sketched below.
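For illustration (the notation here is assumed, not taken from the paper), a multiple-rank regression of this kind predicts the response of a matrix sample $X \in \mathbb{R}^{p\times q}$ through several rank-one bilinear projections:

$$f(X) \;=\; \sum_{k=1}^{r} u_{k}^{\top} X\, v_{k} + b \;=\; \operatorname{tr}\!\left(W^{\top} X\right) + b, \qquad W = \sum_{k=1}^{r} u_{k} v_{k}^{\top}.$$

The rank $r$ controls capacity: $r=1$ gives rank-one (bilinear) regression with only $p+q$ weight parameters, while $r=\min(p,q)$ leaves $W$ unconstrained and recovers ordinary linear regression on the vectorized image, which is the sense in which the model extends traditional linear regression.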

2.   C. Hou, F. Nie, C. Zhang, Y. Wu. Learning a Subspace for Clustering via Pattern Shrinking. Information Processing & Management (IPM), 49(4): 871-883, 2013. (SLPS.pdf) (Matlab Code.rar) (IPM_Toy.zip)

We focus on the problem of learning a subspace for clustering. We propose a novel strategy, named pattern shrinking, to maintain the nonlinear structure in linear subspace learning.

3.   X. Liu, J. Yin, L. Wang, L. Liu, J. Liu, C. Hou, J. Zhang. An Adaptive Approach to Learning Optimal Neighborhood Kernels. IEEE Transactions on Cybernetics, 43(1): 371-384, 2013. (MK-ONJKL.pdf)

4.   S. Yang, C. Hou, C. Zhang, Y. Wu. Robust non-negative matrix factorization via joint sparse and graph regularization for transfer learning. Neural Computing and Applications, 23(2): 541-559, 2013. (RSGTL.pdf)

5.   S. Yang, C. Hou, C. Zhang, Y. Wu, S. Weng. Robust non-negative matrix factorization via joint sparse and graph regularization. Proceedings of the 2013 International Joint Conference on Neural Networks, pp. 1-5, 2013. (RSGNMF.pdf)

2012

1.   W. Yang, C. Hou, Y. Wu. A Semi-supervised Classification Algorithm Using Universum [J]. Computer Engineering and Applications, 2012, 48(6): 155-157, 176. (SSCU.pdf)

2.   F. Nie, S. Xiang, Y. Liu, C. Hou, C. Zhang. Orthogonal vs. uncorrelated least squares discriminant analysis for feature extraction. Pattern Recognition Letters 33(5): 485-491, 2012. (OLSDA.pdf)

3.   S. Yang, C. Hou, F. Nie, Y. Wu. Unsupervised maximum margin feature selection via L2,1-norm minimization. Neural Computing and Applications 21(7): 1791-1799, 2012. (UMMFSSC.pdf)

4.   S. Yang, M. Lin, C. Hou, C. Zhang, Y. Wu. A general framework for transfer sparse subspace learning. Neural Computing and Applications, 21(7): 1801-1817, 2012. (TSSL.pdf)

2011

1.   C. Hou, F. Nie, F. Wang, C. Zhang, Y. Wu. Semisupervised Learning Using Negative Labels. IEEE Transactions on Neural Networks, 22(3), Mar. 2011, pp. 420-432. (NPL.pdf) (Matlab Code.rar)
We focus on the problem of semi-supervised learning. We propose a new kind of label, the negative label, which indicates that a sample does not belong to a particular class, and we demonstrate its effectiveness in semi-supervised learning.

2.   C. Hou, F. Nie, D. Yi, Y. Wu. Feature selection via joint embedding learning and sparse regression. Proceedings of the Twenty-Second International Joint Conference on Artificial Intelligence, vol. 2, pp. 1324-1329, 2011. (IJCAI11.pdf)

We focus on the problem of unsupervised feature selection. This is the short conference version of the journal paper "Joint Embedding Learning and Sparse Regression: A Framework for Unsupervised Feature Selection" listed under 2014.

3.   C. Hou, F. Nie, Y. Wu. Semi-supervised Dimensionality Reduction via Harmonic Functions. Proceedings of the 8th International Conference, MDAI 2011, Changsha, Hunan, China, July 28-30, 2011, pp. 91-102. (SSDRHF.pdf)

4.   X. Liu, C. Hou, Q. Luo, D. Yi. Uncovering community structure in social networks by clique correlation. Proceedings of the 8th International Conference, MDAI 2011, Changsha, Hunan, China, July 28-30, 2011, pp. 247-258. (Spinglass.pdf)

5.   Y. Hu, Y. Wang, Y. Wu, Q. Li, C. Hou. Generalized Mahalanobis depth in the reproducing kernel Hilbert space. Statistical Papers 52(3): 511-522, 2011. (GMD.pdf)

6.   W. Yang, C. Hou, Y. Wu. A Semi-supervised Method for Feature Selection. Proceedings of the 2011 International Conference on Computational and Information Sciences, 2011, pp. 329-332. (SDFS.pdf)

7.   S. Yang, C. Hou, Y. Wu. GM-transfer: Graph-based model for transfer learning. Proceedings of the 2011 First Asian Conference on Pattern Recognition, 2011, pp. 37-41. (GMTransfer.pdf)

2010

1.   C. Hou, C. Zhang, Y. Wu, F. Nie. Multiple view semi-supervised dimensionality reduction, Pattern Recognition, Volume 43, Issue 3, March 2010, Pages 720-730. (MVSSDR.pdf) (Matlab Code.rar)

We focus on the problem of semi-supervised dimensionality reduction on multi-view data. We propose to use link information for multi-view dimensionality reduction.

2.   K. Hu, C. Hou, Y. Wu. Dimensionality Reduction of Tensor Data Based on Harmonic Functions [J]. Computer Engineering and Applications, 2010, (22): 184-186. (TDRHF.pdf)

2009

1.   C. Hou, C. Zhang, Y. Wu, Y. Jiao. Stable local dimensionality reduction approaches, Pattern Recognition, Volume 42, Issue 9, September 2009, Pages 2054-2066. (SLDRS.pdf) (Matlab Code.rar)

We focus on the problem of enhancing the stability of local dimensionality reduction approaches. We propose a novel framework that adds global information to local methods.

2.   C. Hou, J. Wang, Y. Wu, D. Yi. Local linear transformation embedding, Neurocomputing, Volume 72, Issues 10-12, June 2009, Pages 2368-2378. (LLTE.pdf) (Matlab Code.rar)

We focus on the problem of alleviating the local sensitivity of LLE. We compute the local approximation weights in a tangent space; a minimal sketch of this idea follows.
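The sketch below is illustrative only, not the authors' code: it assumes a neighborhood size k, a tangent dimension d estimated by local PCA, and a small regularizer reg, and shows how LLE-style reconstruction weights can be solved in the projected tangent coordinates.

import numpy as np

def tangent_space_weights(X, i, k=10, d=2, reg=1e-3):
    """LLE-style reconstruction weights for sample X[i], computed after
    projecting its neighborhood onto a local tangent space (PCA basis).
    k, d and reg are illustrative parameters."""
    # k nearest neighbors of X[i] (index 0 is the point itself, so skip it)
    dists = np.linalg.norm(X - X[i], axis=1)
    nbrs = np.argsort(dists)[1:k + 1]

    # Estimate the local tangent space by PCA on the centered neighborhood
    centered = X[nbrs] - X[i]
    _, _, Vt = np.linalg.svd(centered, full_matrices=False)
    T = Vt[:d]                    # d x D orthonormal basis of the tangent space

    # Coordinates of the neighbors in the tangent space (X[i] maps to the origin)
    Z = centered @ T.T            # k x d

    # Standard LLE weights, but built from the Gram matrix of the projected
    # coordinates; the constraint sum_j w_j = 1 is handled as in ordinary LLE
    G = Z @ Z.T + reg * np.trace(Z @ Z.T) * np.eye(k)
    w = np.linalg.solve(G, np.ones(k))
    return nbrs, w / w.sum()

Compared with ordinary LLE, the only change in this sketch is that the reconstruction weights are solved from the Gram matrix of the d-dimensional tangent coordinates rather than of the original high-dimensional neighbors.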

3.   C. Hou, F. Nie, C. Zhang, Y. Wu. Learning an orthogonal and smooth subspace for image classification. IEEE Signal Processing Letters, 16(4): 303-306, 2009. (OSSL.pdf) (OSSL.zip)

We focus on the problem of subspace learning for image classification. We add a spatially smooth regularizer and impose an orthogonal constraint when learning the subspace; a schematic form is sketched below.
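As a schematic illustration only (the exact objective and notation are given in the paper; the form below is an assumption for this summary), such a criterion combines a discriminative subspace-learning loss with a spatial smoothness penalty on each projection vector under an orthogonality constraint:

$$\min_{W:\,W^{\top}W=I}\ \mathcal{L}_{\text{disc}}(W) \;+\; \lambda \sum_{j=1}^{m} w_{j}^{\top} \Delta\, w_{j},$$

where each column $w_{j}$ of $W$ is interpreted as an image on the pixel grid and $\Delta$ is the corresponding discrete 2-D Laplacian acting on the vectorized image, so $w_{j}^{\top}\Delta w_{j}$ penalizes spatially non-smooth projection bases while $W^{\top}W=I$ keeps the learned subspace orthogonal.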

4.   C. Hou, F. Nie, C. Zhang, Y. Wu. Learning a subspace for face image clustering via trace ratio criterion, Opt. Eng. 48(6): 060501, 2009. (SCTR.pdf)

5.   C. Zhang, F. Nie, S. Xiang, C. Hou. Soft constraint harmonic energy minimization for transductive learning and its two interpretations. Neural Processing Letters, 30(2): 89-102, October 2009. (SLSC.pdf)

6.   C. Hou, Y. Wu, D. Yi. A New Unified Framework for Manifold Learning Methods and an Improved Laplacian Eigenmaps Method [J]. Journal of Computer Research and Development, 2009, 46(4): 676-682. (ILE.pdf)

2008

1.   C. Hou, Y. Jiao, Y. Wu, D. Yi. Relaxed maximum-variance unfolding [J]. Optical Engineering, 2008, 47(7): 077202. (RMVU.pdf)

2.   C. Hou, D. Yi, Y. Wu. A Web Resource Description Algorithm Based on Maximum Variance Unfolding [J]. Journal of System Simulation, 2008, 20: 5553-5557. (MVUDiscription.pdf)

3.   C. Hou, Y. Wu, D. Yi, Y. Jiao. Novel semisupervised high-dimensional correspondences learning method [J]. Optical Engineering, 2008, 47(4): 047201. (SMVUplusLLE.pdf)

2007

1.   C. Hou, Y. Wu. Learning High Dimensional Correspondences Based on Maximum Variance Unfolding, International Conference on Mechatronics and Automation, pp. 635-640, 2007. (LMVU.pdf)


