Machine learning in non-stationary environments : introduction to covariate shift adaptation /

This volume focuses on a specific non-stationary environment known as covariate shift, in which the distribution of inputs (queries) changes but the conditional distribution of outputs (answers) is unchanged, and presents machine learning theory, algorithms, and applications to overcome this variety of non-stationarity.


Bibliographic Details
Main Author: Sugiyama, Masashi, 1974-
Other Authors: Kawanabe, Motoaki
Format: Electronic eBook
Language: English
Published: Cambridge, Mass. : MIT Press, ©2012.
Series: Adaptive computation and machine learning.
Subjects: Machine learning
Online Access: Full text (Emmanuel users only)

MARC

LEADER 00000cam a2200000 a 4500
001 in00000148187
006 m o d
007 cr cnu---unuuu
008 120409s2012 maua ob 001 0 eng d
005 20240702192339.3
016 7 |a 016034761  |2 Uk 
019 |a 794489207  |a 817078669  |a 961629362  |a 962696956  |a 988440888  |a 988450036  |a 992079675  |a 1037929255  |a 1038618374  |a 1055396331  |a 1065925703  |a 1081285003  |a 1153462521 
020 |a 9780262301220  |q (electronic bk.) 
020 |a 0262301229  |q (electronic bk.) 
020 |a 1280499222 
020 |a 9781280499227 
020 |z 9780262017091 
020 |z 0262017091 
024 8 |a 9786613594457 
035 |a (OCoLC)784949353  |z (OCoLC)794489207  |z (OCoLC)817078669  |z (OCoLC)961629362  |z (OCoLC)962696956  |z (OCoLC)988440888  |z (OCoLC)988450036  |z (OCoLC)992079675  |z (OCoLC)1037929255  |z (OCoLC)1038618374  |z (OCoLC)1055396331  |z (OCoLC)1065925703  |z (OCoLC)1081285003  |z (OCoLC)1153462521 
037 |a 22573/ctt58f2bs  |b JSTOR 
037 |a 8494  |b MIT Press 
037 |a 9780262301220  |b MIT Press 
040 |a N$T  |b eng  |e pn  |c N$T  |d YDXCP  |d E7B  |d CDX  |d COO  |d OCLCQ  |d DEBSZ  |d OCLCQ  |d IEEEE  |d OCLCF  |d OTZ  |d OCLCQ  |d JSTOR  |d OCLCQ  |d EBLCP  |d COD  |d OCLCQ  |d IDEBK  |d AZK  |d STBDS  |d AGLDB  |d OCLCQ  |d TOA  |d OCLCQ  |d MOR  |d PIFAG  |d ZCU  |d OCLCQ  |d MERUC  |d OCLCQ  |d IOG  |d NJR  |d U3W  |d OCLCQ  |d EZ9  |d STF  |d WRM  |d OCLCQ  |d VTS  |d MERER  |d OCLCQ  |d ICG  |d CUY  |d OCLCQ  |d INT  |d VT2  |d AU@  |d OCLCQ  |d MITPR  |d WYU  |d LEAUB  |d DKC  |d OCLCQ  |d UKCRE  |d UKAHL  |d AJS  |d OCLCQ  |d OCLCO  |d SFB  |d OCLCO  |d OCLCQ  |d OCLCO  |d OCLCL 
050 4 |a Q325.5  |b .S845 2012eb 
072 7 |a COM  |x 005030  |2 bisacsh 
072 7 |a COM  |x 004000  |2 bisacsh 
072 7 |a COM037000  |2 bisacsh 
072 7 |a COM004000  |2 bisacsh 
082 0 4 |a 006.3/1  |2 23 
100 1 |a Sugiyama, Masashi,  |d 1974-  |1 https://id.oclc.org/worldcat/entity/E39PBJgCGQdKGGMBPRGxy6RqcP 
245 1 0 |a Machine learning in non-stationary environments :  |b introduction to covariate shift adaptation /  |c Masashi Sugiyama and Motoaki Kawanabe. 
260 |a Cambridge, Mass. :  |b MIT Press,  |c ©2012. 
300 |a 1 online resource (xiv, 261 pages) :  |b illustrations 
336 |a text  |b txt  |2 rdacontent 
337 |a computer  |b c  |2 rdamedia 
338 |a online resource  |b cr  |2 rdacarrier 
347 |a data file 
490 1 |a Adaptive computation and machine learning 
504 |a Includes bibliographical references and index. 
505 0 |a Foreword; Preface; I INTRODUCTION; 1 Introduction and Problem Formulation; 1.1 Machine Learning under Covariate Shift; 1.2 Quick Tour of Covariate Shift Adaptation; 1.3 Problem Formulation; 1.4 Structure of This Book; II LEARNING UNDER COVARIATE SHIFT; 2 Function Approximation; 2.1 Importance-Weighting Techniques for Covariate Shift Adaptation; 2.2 Examples of Importance-Weighted Regression Methods; 2.3 Examples of Importance-Weighted Classification Methods; 2.4 Numerical Examples; 2.5 Summary and Discussion; 3 Model Selection; 3.1 Importance-Weighted Akaike Information Criterion. 
505 8 |a 3.2 Importance-Weighted Subspace Information Criterion3.3 Importance-Weighted Cross-Validation; 3.4 Numerical Examples; 3.5 Summary and Discussion; 4 Importance Estimation; 4.1 Kernel Density Estimation; 4.2 Kernel Mean Matching; 4.3 Logistic Regression; 4.4 Kullback-Leibler Importance Estimation Procedure; 4.5 Least-Squares Importance Fitting; 4.6 Unconstrained Least-Squares Importance Fitting; 4.7 Numerical Examples; 4.8 Experimental Comparison; 4.9 Summary; 5 Direct Density-Ratio Estimation with Dimensionality Reduction; 5.1 Density Difference in Hetero-Distributional Subspace. 
505 8 |a 5.2 Characterization of Hetero-Distributional Subspace5.3 Identifying Hetero-Distributional Subspace by Supervised Dimensionality Reduction; 5.4 Using LFDA for Finding Hetero-Distributional Subspace; 5.5 Density-Ratio Estimation in the Hetero-Distributional Subspace; 5.6 Numerical Examples; 5.7 Summary; 6 Relation to Sample Selection Bias; 6.1 Heckman's Sample Selection Model; 6.2 Distributional Change and Sample Selection Bias; 6.3 The Two-Step Algorithm; 6.4 Relation to Covariate Shift Approach; 7 Applications of Covariate Shift Adaptation; 7.1 Brain-Computer Interface. 
505 8 |a 7.2 Speaker Identification7.3 Natural Language Processing; 7.4 Perceived Age Prediction from Face Images; 7.5 Human Activity Recognition from Accelerometric Data; 7.6 Sample Reuse in Reinforcement Learning; III LEARNING CAUSING COVARIATE SHIFT; 8 Active Learning; 8.1 Preliminaries; 8.2 Population-Based Active Learning Methods; 8.3 Numerical Examples of Population-Based Active Learning Methods; 8.4 Pool-Based Active Learning Methods; 8.5 Numerical Examples of Pool-Based Active Learning Methods; 8.6 Summary and Discussion; 9 Active Learning with Model Selection. 
505 8 |a 9.1 Direct Approach and the Active Learning/Model Selection Dilemma9.2 Sequential Approach; 9.3 Batch Approach; 9.4 Ensemble Active Learning; 9.5 Numerical Examples; 9.6 Summary and Discussion; 10 Applications of Active Learning; 10.1 Design of Efficient Exploration Strategies in Reinforcement Learning; 10.2 Wafer Alignment in Semiconductor Exposure Apparatus; IV CONCLUSIONS; 11 Conclusions and Future Prospects; 11.1 Conclusions; 11.2 Future Prospects; Appendix: List of Symbols and Abbreviations; Bibliography; Index. 
520 8 |a This volume focuses on a specific non-stationary environment known as covariate shift, in which the distribution of inputs (queries) changes but the conditional distribution of outputs (answers) is unchanged, and presents machine learning theory, algorithms, and applications to overcome this variety of non-stationarity. 
588 0 |a Print version record. 
650 0 |a Machine learning. 
700 1 |a Kawanabe, Motoaki. 
758 |i has work:  |a Machine learning in non-stationary environments (Text)  |1 https://id.oclc.org/worldcat/entity/E39PCGXKgFtBBqkqPbKVBgVDdP  |4 https://id.oclc.org/worldcat/ontology/hasWork 
776 0 8 |i Print version:  |a Sugiyama, Masashi, 1974-  |t Machine learning in non-stationary environments.  |d Cambridge, Mass. : MIT Press, ©2012  |z 9780262017091  |w (DLC) 2011032824  |w (OCoLC)752909553 
830 0 |a Adaptive computation and machine learning. 
852 |b Online  |h ProQuest 
856 4 0 |u https://ebookcentral.proquest.com/lib/emmanuel/detail.action?docID=3339422  |z Full text (Emmanuel users only)  |t 0 
938 |a Askews and Holts Library Services  |b ASKH  |n AH25668528 
938 |a Coutts Information Services  |b COUT  |n 22291156 
938 |a EBL - Ebook Library  |b EBLB  |n EBL3339422 
938 |a ebrary  |b EBRY  |n ebr10547396 
938 |a EBSCOhost  |b EBSC  |n 445720 
938 |a ProQuest MyiLibrary Digital eBook Collection  |b IDEB  |n 359445 
938 |a Oxford University Press USA  |b OUPR  |n EDZ0000155739 
938 |a YBP Library Services  |b YANK  |n 7594876 
947 |a FLO  |x pq-ebc-base 
999 f f |s f0b48cc4-5484-49c5-8c7a-bb7950df7efc  |i 33b1879a-8276-4e34-b773-722b4d945b3b  |t 0 
952 f f |a Emmanuel College  |b Main Campus  |c Emmanuel College Library  |d Online  |t 0  |e ProQuest  |h Other scheme 