{"id":7342,"date":"2020-12-05T18:07:08","date_gmt":"2020-12-05T10:07:08","guid":{"rendered":"https:\/\/cde.nus.edu.sg\/ece\/?post_type=nus-news&#038;p=7342"},"modified":"2024-07-31T16:13:10","modified_gmt":"2024-07-31T08:13:10","slug":"young-researcher-award_assistant-professor-feng-jiashi","status":"publish","type":"nus-news","link":"https:\/\/cde.nus.edu.sg\/ece\/news\/young-researcher-award_assistant-professor-feng-jiashi\/","title":{"rendered":"Young Researcher Award_Assistant Professor Feng Jiashi"},"content":{"rendered":"<p style=\"text-align: justify\">Congratulations to Assistant Professor Feng Jiashi, winner of the <strong>Young Researcher Award<\/strong> &#8211; among seven awardees at NUS University Awards.<\/p>\n<p style=\"text-align: justify\"><em>&#8220;To develop artificial general intelligence \u2014 teaching computers to see and think like humans is the very first step.&#8221;<\/em> &#8211;\u00a0 Assistant Professor Feng Jiashi.<\/p>\n<div id='gallery-1' class='gallery galleryid-7342 gallery-columns-1 gallery-size-full'><figure class='gallery-item'>\n\t\t\t<div class='gallery-icon landscape'>\n\t\t\t\t<img loading=\"lazy\" decoding=\"async\" width=\"1279\" height=\"683\" src=\"https:\/\/cde.nus.edu.sg\/ece\/wp-content\/uploads\/sites\/3\/2020\/12\/YRA-Feng-Jiashi.png\" class=\"attachment-full size-full\" alt=\"Yra Feng Jiashi\" srcset=\"https:\/\/cde.nus.edu.sg\/ece\/wp-content\/uploads\/sites\/3\/2020\/12\/YRA-Feng-Jiashi.png 1279w, https:\/\/cde.nus.edu.sg\/ece\/wp-content\/uploads\/sites\/3\/2020\/12\/YRA-Feng-Jiashi-300x160.png 300w, https:\/\/cde.nus.edu.sg\/ece\/wp-content\/uploads\/sites\/3\/2020\/12\/YRA-Feng-Jiashi-1024x547.png 1024w, https:\/\/cde.nus.edu.sg\/ece\/wp-content\/uploads\/sites\/3\/2020\/12\/YRA-Feng-Jiashi-768x410.png 768w\" sizes=\"auto, (max-width: 1279px) 100vw, 1279px\" \/>\n\t\t\t<\/div><\/figure>\n\t\t<\/div>\n\n<h3>Research Achievements<\/h3>\n<ul>\n<li style=\"text-align: justify\">Among the first in the world to 
develop a novel technique capable of recognising faces under challenging scenarios, which was deployed in Panasonic\u2019s FacePRO and the Ministry of Home Affairs\u2019 video analytics system.<\/li>\n<li style=\"text-align: justify\">Developed the world\u2019s first AI model for multiple-person parsing and the first dataset to facilitate research in this field. The finding was accorded the Best Student Paper Award at the ACM Multimedia Conference 2018.<\/li>\n<li style=\"text-align: justify\">Pioneered the development of a visual search artificial neural network model that exhibits performance and behaviour similar to the human brain\u2019s on visual search tasks. The finding was published in <em>Nature Communications<\/em> in September 2018.<\/li>\n<li style=\"text-align: justify\">Revived distribution-alignment-based domain adaptation techniques, which are key to enabling the deployment of AI models across domains with differing characteristics. The finding was accorded the Best Paper Award from the \u2018Transferring and Adapting Source Knowledge in Computer Vision\u2019 workshop of the Association for the Advancement of Artificial Intelligence (AAAI) 2016.<\/li>\n<li style=\"text-align: justify\">Co-Principal Investigator for Singapore\u2019s first autonomous bus; raised $7 million in funding from the National Research Foundation\u2019s Competitive Research Programme and the Land Transport Authority.<\/li>\n<li style=\"text-align: justify\">Secured $5 million in funding from global technology giants such as Adobe and Snap, as well as other agencies.<\/li>\n<\/ul>\n<h3>Publication Credits<\/h3>\n<ul>\n<li style=\"text-align: justify\">Published more than 280 papers in leading journals and at prestigious conferences.<\/li>\n<li style=\"text-align: justify\">Obtained over 12,000 total citations with a Hirsch index of 58.<\/li>\n<\/ul>\n<h3>International Standing<\/h3>\n<ul>\n<li style=\"text-align: justify\">Reviewer for more than 10 journals published by the Institute of Electrical and Electronics Engineers, as well as for the Research Grants Council of Hong Kong and international conferences.<\/li>\n<li style=\"text-align: justify\">Area Chair for the International Conference on Learning Representations (2020), Neural Information Processing Systems (2020), the British Machine Vision Conference (2019) and the ACM Multimedia Conference (2017 to 2019).<\/li>\n<li style=\"text-align: justify\">Technical Program Chair for the International Conference on Multimedia Retrieval (2017).<\/li>\n<li style=\"text-align: justify\">Keynote\/invited speaker at several international conferences and workshops.<\/li>\n<\/ul>\n<h3>Awards and Accolades<\/h3>\n<ul>\n<li style=\"text-align: justify\">Top 10 \u2018Innovators under 35\u2019 in Asia, <em>MIT Technology Review<\/em> (2018).<\/li>\n<li style=\"text-align: justify\">Winner awards at the Conference on Computer Vision and Pattern Recognition (2017), the International Conference on Computer Vision (2017) and the International Conference on Multimodal Interaction (2016).<\/li>\n<li style=\"text-align: justify\">Early Career Research Award, NUS (2017).<\/li>\n<\/ul>\n","protected":false},"excerpt":{"rendered":"<p>Congratulations to Assistant Professor Feng Jiashi, winner of the Young Researcher Award, one of seven awardees at the NUS University Awards. &#8220;To develop artificial general intelligence \u2014 teaching computers to see and think like humans is the very first step.&#8221; &#8211; Assistant Professor Feng Jiashi. 
Research Achievements Among the first in the world to develop a<\/p>\n","protected":false},"author":31,"featured_media":7343,"parent":0,"menu_order":0,"template":"","meta":{"_acf_changed":false,"site-sidebar-layout":"default","site-content-layout":"","ast-site-content-layout":"default","site-content-style":"default","site-sidebar-style":"default","ast-global-header-display":"","ast-banner-title-visibility":"","ast-main-header-display":"","ast-hfb-above-header-display":"","ast-hfb-below-header-display":"","ast-hfb-mobile-header-display":"","site-post-title":"","ast-breadcrumbs-content":"","ast-featured-img":"","footer-sml-layout":"","theme-transparent-header-meta":"default","adv-header-id-meta":"","stick-header-meta":"","header-above-stick-meta":"","header-main-stick-meta":"","header-below-stick-meta":"","astra-migrate-meta-layouts":"set","ast-page-background-enabled":"default","ast-page-background-meta":{"desktop":{"background-color":"var(--ast-global-color-4)","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-gradient":""},"tablet":{"background-color":"","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-gradient":""},"mobile":{"background-color":"","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-gradient":""}},"ast-content-background-meta":{"desktop":{"background-color":"var(--ast-global-color-5)","background-image":"","background-repeat":"repeat","background-position":"center 
center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-gradient":""},"tablet":{"background-color":"var(--ast-global-color-5)","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-gradient":""},"mobile":{"background-color":"var(--ast-global-color-5)","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-gradient":""}},"footnotes":""},"news_category":[37,47],"class_list":["post-7342","nus-news","type-nus-news","status-publish","has-post-thumbnail","hentry","news_category-ece","news_category-awards-achievements"],"acf":[],"_links":{"self":[{"href":"https:\/\/cde.nus.edu.sg\/ece\/wp-json\/wp\/v2\/news\/7342","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/cde.nus.edu.sg\/ece\/wp-json\/wp\/v2\/news"}],"about":[{"href":"https:\/\/cde.nus.edu.sg\/ece\/wp-json\/wp\/v2\/types\/nus-news"}],"author":[{"embeddable":true,"href":"https:\/\/cde.nus.edu.sg\/ece\/wp-json\/wp\/v2\/users\/31"}],"version-history":[{"count":2,"href":"https:\/\/cde.nus.edu.sg\/ece\/wp-json\/wp\/v2\/news\/7342\/revisions"}],"predecessor-version":[{"id":15137,"href":"https:\/\/cde.nus.edu.sg\/ece\/wp-json\/wp\/v2\/news\/7342\/revisions\/15137"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/cde.nus.edu.sg\/ece\/wp-json\/wp\/v2\/media\/7343"}],"wp:attachment":[{"href":"https:\/\/cde.nus.edu.sg\/ece\/wp-json\/wp\/v2\/media?parent=7342"}],"wp:term":[{"taxonomy":"news_category","embeddable":true,"href":"https:\/\/cde.nus.edu.sg\/ece\/wp-json\/wp\/v2\/news_category?post=7342"}],"curies":[{"name":"wp","href":"https:\/\/api.w.o
rg\/{rel}","templated":true}]}}