BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Computational Optimisation Group - ECPv6.15.11//NONSGML v1.0//EN
CALSCALE:GREGORIAN
METHOD:PUBLISH
X-WR-CALNAME:Computational Optimisation Group
X-ORIGINAL-URL:http://optimisation.doc.ic.ac.uk
X-WR-CALDESC:Events for Computational Optimisation Group
REFRESH-INTERVAL;VALUE=DURATION:PT1H
X-Robots-Tag:noindex
X-PUBLISHED-TTL:PT1H
BEGIN:VTIMEZONE
TZID:UTC
BEGIN:STANDARD
TZOFFSETFROM:+0000
TZOFFSETTO:+0000
TZNAME:UTC
DTSTART:20120101T000000
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTART;TZID=UTC:20130620T140000
DTEND;TZID=UTC:20130620T140000
DTSTAMP:20260419T030834Z
CREATED:20170124T102142Z
LAST-MODIFIED:20170124T102142Z
UID:579-1371736800-1371736800@optimisation.doc.ic.ac.uk
SUMMARY:Seminar: Parallel block coordinate descent methods for huge-scale partially separable problems
DESCRIPTION:Title: Parallel block coordinate descent methods for huge-scale partially separable problems\nSpeaker: Martin Takac\nAffiliation: School of Mathematics – University of Edinburgh\nLocation: CPSE Seminar room\nTime: 2:00pm\nAbstract. In this work we show that randomized block coordinate descent methods can be accelerated by parallelization when applied to the problem of minimizing the sum of a partially block separable smooth convex function and a simple block separable convex function. We give a generic algorithm and several variants thereof based on the way parallelization is performed. In all cases we prove iteration complexity results\, i.e.\, we give bounds on the number of iterations sufficient to approximately solve the problem with high probability. Our results generalize the intuitive observation that in the separable case the theoretical speedup caused by parallelization is equal to the number of processors. We show that the speedup increases with the number of processors and with the degree of partial separability of the smooth component of the objective function. Our analysis also works in the mode where the number of blocks updated at each iteration is random\, which allows for modelling situations with a variable (busy or unreliable) number of processors. We conclude with some encouraging computational results on huge-scale LASSO and sparse SVM instances. This is joint work with Dr. Peter Richtarik\, University of Edinburgh.\nAbout the speaker.
URL:http://optimisation.doc.ic.ac.uk/event/seminar-parallel-block-coordinate-descent-methods-for-huge-scale-partially-separable-problems/
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=UTC:20130620T160000
DTEND;TZID=UTC:20130620T160000
DTSTAMP:20260419T030834Z
CREATED:20170124T102141Z
LAST-MODIFIED:20170124T102141Z
UID:578-1371744000-1371744000@optimisation.doc.ic.ac.uk
SUMMARY:Seminar: Robust Data-Driven Approach in Decision Making Under Uncertainty
DESCRIPTION:Title: Robust Data-Driven Approach in Decision Making Under Uncertainty\nSpeaker: Grani Adiwena Hanasusanto\nAffiliation: Department of Computing – Imperial College London\nLocation: Room 301 William Penney\nTime: 4:00pm\nAbstract. We investigate a robust data-driven approach to stochastic optimization problems where partial knowledge of the exogenous uncertainties is available to the decision maker. In contrast to the traditional model-based approach\, a data-driven approach requires no assumptions on the underlying distribution of the exogenous uncertainties. Estimation of the conditional expectation is achieved using a kernel regression scheme which evaluates the cost function solely at historical observations. If only sparse historical observations are available\, however\, the estimation is inaccurate and the resulting decision performs poorly in out-of-sample tests. To alleviate this unfavourable outcome\, we ‘robustify’ the decision against estimation errors by utilizing techniques from robust optimization. We show that the arising min-max problem can be reformulated as a tractable conic program. We further extend the proposed approach to multi-period settings and introduce an approximate dynamic programming framework that retains the tractability of the formulation and is amenable to efficient parallel implementation. The proposed approach is tested across several application domains and is shown to outperform various non-robust schemes in terms of standard statistical benchmarks.\nAbout the speaker. Grani Hanasusanto is a PhD student at the Department of Computing\, Imperial College London\, under the supervision of Dr. Daniel Kuhn. He obtained a BEng (Hons) degree in Electrical and Electronic Engineering from Nanyang Technological University\, Singapore\, and an MSc degree in Financial Engineering from the National University of Singapore. His research interests are in numerical and computational methods and their applications.
URL:http://optimisation.doc.ic.ac.uk/event/seminar-robust-data-driven-approach-in-decision-making-under-uncertainty/
END:VEVENT
END:VCALENDAR