Hierarchical semi-supervised factorization for learning the semantics

Bin Shen, Olzhas Makhambetov

Research output: Contribution to journal › Article

1 Citation (Scopus)

Abstract

Most semi-supervised learning methods extend existing supervised or unsupervised techniques by incorporating additional information from unlabeled or labeled data. Unlabeled instances help in learning statistical models that fully describe the global properties of the data, whereas labeled instances make the learned knowledge more human-interpretable. In this paper we present a novel way of extending conventional non-negative matrix factorization (NMF) and probabilistic latent semantic analysis (pLSA) to semi-supervised versions by incorporating label information for learning semantics. The proposed algorithm consists of two steps: first, acquiring prior bases representing some classes from labeled data; second, utilizing them to guide the learning of final bases that are semantically interpretable.
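The two-step scheme described in the abstract can be illustrated with plain NMF as the factorization engine. The sketch below is a minimal, hypothetical rendering of the idea (learn one prior basis per class from labeled data, then softly pull a subset of the final bases toward those priors); the update rules and the blending parameter `alpha` are assumptions for illustration, not the authors' exact algorithm.

```python
# Illustrative sketch of the two-step "prior bases guide final bases" idea.
# NOT the authors' exact method: the Lee-Seung updates and the alpha-blend
# guidance step are assumptions made for this example.
import numpy as np

def nmf(X, k, iters=200, rng=None):
    """Plain NMF via Lee-Seung multiplicative updates: X ≈ W @ H."""
    rng = rng or np.random.default_rng(0)
    m, n = X.shape
    W = rng.random((m, k)) + 1e-3
    H = rng.random((k, n)) + 1e-3
    for _ in range(iters):
        H *= (W.T @ X) / (W.T @ W @ H + 1e-9)
        W *= (X @ H.T) / (W @ H @ H.T + 1e-9)
    return W, H

def hierarchical_ssnmf(X_all, labeled, k_total, alpha=0.5, iters=200):
    """Step 1: learn one prior basis per class from the labeled subsets.
    Step 2: factorize all data while blending the first columns of W
    back toward the priors each iteration (alpha = guidance strength)."""
    # Step 1: a rank-1 NMF per class yields one prior basis vector each.
    priors = [nmf(Xc, 1)[0][:, 0] for Xc in labeled]
    W_prior = np.column_stack(priors)
    k_prior = W_prior.shape[1]
    assert k_prior <= k_total
    # Step 2: warm-start the guided columns with the priors.
    rng = np.random.default_rng(0)
    m, n = X_all.shape
    W = rng.random((m, k_total)) + 1e-3
    W[:, :k_prior] = W_prior
    H = rng.random((k_total, n)) + 1e-3
    for _ in range(iters):
        H *= (W.T @ X_all) / (W.T @ W @ H + 1e-9)
        W *= (X_all @ H.T) / (W @ H @ H.T + 1e-9)
        # Soft guidance: pull guided columns toward the class priors,
        # so those bases stay semantically tied to the labeled classes.
        W[:, :k_prior] = (1 - alpha) * W[:, :k_prior] + alpha * W_prior
    return W, H
```

The remaining `k_total - k_prior` unguided columns are free to capture structure present only in the unlabeled data, which is one plausible reading of how unlabeled instances "describe the global properties of the data" while labeled instances keep part of the basis interpretable.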

Original language: English
Pages (from-to): 366-374
Number of pages: 9
Journal: Journal of Advanced Computational Intelligence and Intelligent Informatics
Volume: 18
Issue number: 3
Publication status: Published - 2014


Keywords

  • Non-negative matrix factorization
  • Probabilistic latent semantic analysis
  • Semi-supervised learning

ASJC Scopus subject areas

  • Artificial Intelligence
  • Computer Vision and Pattern Recognition
  • Human-Computer Interaction

Cite this

Hierarchical semi-supervised factorization for learning the semantics. / Shen, Bin; Makhambetov, Olzhas.

In: Journal of Advanced Computational Intelligence and Intelligent Informatics, Vol. 18, No. 3, 2014, p. 366-374.

@article{741c92fa0f0a4cd8a24ff63feaccf972,
title = "Hierarchical semi-supervised factorization for learning the semantics",
abstract = "Most semi-supervised learning methods extend existing supervised or unsupervised techniques by incorporating additional information from unlabeled or labeled data. Unlabeled instances help in learning statistical models that fully describe the global properties of the data, whereas labeled instances make the learned knowledge more human-interpretable. In this paper we present a novel way of extending conventional non-negative matrix factorization (NMF) and probabilistic latent semantic analysis (pLSA) to semi-supervised versions by incorporating label information for learning semantics. The proposed algorithm consists of two steps: first, acquiring prior bases representing some classes from labeled data; second, utilizing them to guide the learning of final bases that are semantically interpretable.",
keywords = "Non-negative matrix factorization, Probabilistic latent semantic analysis, Semi-supervised learning",
author = "Bin Shen and Olzhas Makhambetov",
year = "2014",
language = "English",
volume = "18",
pages = "366--374",
journal = "Journal of Advanced Computational Intelligence and Intelligent Informatics",
issn = "1343-0130",
publisher = "Fuji Technology Press",
number = "3",
}

TY - JOUR

T1 - Hierarchical semi-supervised factorization for learning the semantics

AU - Shen, Bin

AU - Makhambetov, Olzhas

PY - 2014

Y1 - 2014

N2 - Most semi-supervised learning methods extend existing supervised or unsupervised techniques by incorporating additional information from unlabeled or labeled data. Unlabeled instances help in learning statistical models that fully describe the global properties of the data, whereas labeled instances make the learned knowledge more human-interpretable. In this paper we present a novel way of extending conventional non-negative matrix factorization (NMF) and probabilistic latent semantic analysis (pLSA) to semi-supervised versions by incorporating label information for learning semantics. The proposed algorithm consists of two steps: first, acquiring prior bases representing some classes from labeled data; second, utilizing them to guide the learning of final bases that are semantically interpretable.

AB - Most semi-supervised learning methods extend existing supervised or unsupervised techniques by incorporating additional information from unlabeled or labeled data. Unlabeled instances help in learning statistical models that fully describe the global properties of the data, whereas labeled instances make the learned knowledge more human-interpretable. In this paper we present a novel way of extending conventional non-negative matrix factorization (NMF) and probabilistic latent semantic analysis (pLSA) to semi-supervised versions by incorporating label information for learning semantics. The proposed algorithm consists of two steps: first, acquiring prior bases representing some classes from labeled data; second, utilizing them to guide the learning of final bases that are semantically interpretable.

KW - Non-negative matrix factorization

KW - Probabilistic latent semantic analysis

KW - Semi-supervised learning

UR - http://www.scopus.com/inward/record.url?scp=84901017341&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84901017341&partnerID=8YFLogxK

M3 - Article

AN - SCOPUS:84901017341

VL - 18

SP - 366

EP - 374

JO - Journal of Advanced Computational Intelligence and Intelligent Informatics

JF - Journal of Advanced Computational Intelligence and Intelligent Informatics

SN - 1343-0130

IS - 3

ER -