Address: Beytepe Campus, Ankara, Turkey TR-06800
e-mail: erkut at cs dot hacettepe dot edu dot tr
Phone: +90 (312) 297 7500, ext. 147
Fax: +90 (312) 297 7502
My research centers on computer vision and machine learning. I believe the right algorithms and representations are those that take contextual influences into account. Accordingly, the research objective that my students and I pursue is to incorporate different kinds of context (spatial, temporal, and/or cross-modal) into all levels of visual processing, from low-level through intermediate to high-level vision.
Current research interests: Visual Saliency Prediction, Automatic Image Description, Video/Photoset Summarization, Image Filtering, Image Editing
École Nationale Supérieure des Télécommunications
Middle East Technical University
University of California
Oct. 2007 - Dec. 2007
Virginia Bioinformatics Institute, Virginia Tech
Jul. 2004 - Aug. 2004
[January 2020]: Our joint work with the Cognition, Learning and Robotics (CoLoRs) lab at Bogazici University on reasoning about action effects on articulated multi-part objects has been accepted to ICRA 2020.
[November 2019]: Our work on weakly-supervised dynamic saliency prediction has been accepted for publication in Signal Processing: Image Communication.
[October 2019]: Our work on manipulating transient attributes of natural scenes via hallucination has been accepted for publication in ACM Transactions on Graphics.
[September 2019]: Our work on reasoning over procedural data has been accepted to CoNLL 2019: "Procedural Reasoning Networks for Understanding Multimodal Procedures".
[April 2019]: I will give a tutorial on "Multimodal Learning with Vision and Language" together with Aykut Erdem at IPTA 2019.
[February 2019]: Our joint work with ICON lab at UMRAM, Bilkent University on multi-contrast MRI synthesis with GANs has been accepted for publication in IEEE Transactions on Medical Imaging.
[December 2018]: I will give a talk on Integrated Vision and Language at ITURO 2019.
[December 2018]: I have received the Young Researcher Award given by the Turkish Academy of Sciences.
[August 2018]: Our work on manipulating transient attributes of natural scenes via hallucination is out on arXiv.
[August 2018]: Our work on multimodal machine comprehension has been accepted to EMNLP 2018: "RecipeQA: A Challenge Dataset for Multimodal Comprehension of Cooking Recipes". Read our paper, download the data, and submit your predictions at our project website.
[March 2018]: Our joint project with Lucia Specia from the University of Sheffield on "A Multimodal and Multilingual Framework for Video Captioning" has received financial support from TUBITAK - British Council’s Newton-Katip Çelebi Fund Institutional Links Grant Programme.
[March 2018]: I am invited to Dagstuhl Seminar "Joint Processing of Language and Visual Data for Better Automated Understanding".
[February 2018]: Our joint work with ICON lab at UMRAM, Bilkent University on utilizing GANs for multi-contrast MRI synthesis is out on arXiv.
[January 2018]: A new TUBITAK 1001 project on "Using Synthetic Data for Deep Person Re-Identification" has been launched, in partnership with the HUCG (Hacettepe University Computer Graphics and Game Studies) group.
Project Duration: 2 years (2018-2020)
Sponsors: TUBITAK and British Council - Newton-Katip Çelebi Fund Institutional Links Grant Programme (Award# 217E054)
Project Duration: 2 years (2018-2020)
Sponsors: TUBITAK 1001 - Support Program for Scientific and Technological Research Projects (Award# 217E029)
Project Duration: 3 years (2014-2017)
Sponsors: TUBITAK 1001 - Support Program for Scientific and Technological Research Projects (Award# 113E116) and European Union under European Cooperation in Science and Technology (COST) Programme (ICT COST IC1037 Action)
Project Duration: 3 years (2012-2015)
Sponsors: TUBITAK 3501 - Career Development Program (Award# 112E146)
Project Duration: 3 years (2017-2020)
Sponsors: TUBITAK 1003 - Priority Areas R&D Funding Program (Award# 116E685)
Project Duration: 3 years (2016-2019)
Sponsors: TUBITAK 1007 - Public Institutions Research Funding Program (Award# 114G028)