BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Computational Optimisation Group - ECPv6.15.11//NONSGML v1.0//EN
CALSCALE:GREGORIAN
METHOD:PUBLISH
X-WR-CALNAME:Computational Optimisation Group
X-ORIGINAL-URL:http://optimisation.doc.ic.ac.uk
X-WR-CALDESC:Events for Computational Optimisation Group
REFRESH-INTERVAL;VALUE=DURATION:PT1H
X-Robots-Tag:noindex
X-PUBLISHED-TTL:PT1H
BEGIN:VTIMEZONE
TZID:UTC
BEGIN:STANDARD
TZOFFSETFROM:+0000
TZOFFSETTO:+0000
TZNAME:UTC
DTSTART:20120101T000000
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTART;TZID=UTC:20130516T140000
DTEND;TZID=UTC:20130516T140000
DTSTAMP:20260504T190414Z
CREATED:20170124T102143Z
LAST-MODIFIED:20170124T102143Z
UID:584-1368712800-1368712800@optimisation.doc.ic.ac.uk
SUMMARY:Seminar: Alternating Maximization: Unifying Framework for 8 Sparse PCA Formulations
DESCRIPTION:Title: Alternating Maximization: Unifying Framework for 8 Sparse PCA Formulations\nSpeaker: Dr. Selin Ahipasaoglu\nAffiliation: Singapore University of Technology and Design\nLocation: CPSE seminar room (C615 Roderic Hill)\nTime: 2:00pm \nAbstract. Given a multivariate data set\, sparse principal component analysis (SPCA) aims to extract several linear combinations of the variables that together explain the variance in the data as much as possible\, while controlling the number of nonzero loadings in these combinations. In this paper we consider 8 different optimization formulations for computing a single sparse loading vector; these are obtained by combining the following factors: we employ two norms for measuring variance (L2\, L1) and two sparsity-inducing norms (L0\, L1)\, which are used in two different ways (constraint\, penalty). Three of our formulations\, notably the one with L0 constraint and L1 variance\, have not been considered in the literature. We give a unifying reformulation which we propose to solve via a natural alternating maximization (AM) method. Besides this\, we provide a package which contains implementations for various parallel architectures and briefly discuss how these algorithms can be used to achieve better object recognition in challenging data sets. \nAbout the speaker. Selin Damla Ahipasaoglu is an Assistant Professor at the Singapore University of Technology and Design. She received her PhD in 2009 from Cornell University and specialises in developing algorithms for large-scale optimization problems\, in particular first-order methods for convex problems and applications in image processing. She held research positions at Princeton University and the London School of Economics before joining SUTD. She is also a very keen teacher and an advocate of active and innovative classroom teaching for undergraduates.
URL:http://optimisation.doc.ic.ac.uk/event/seminar-alternating-maximization-unifying-framework-for-8-sparse-pca-formulations/
END:VEVENT
END:VCALENDAR