-
Notifications
You must be signed in to change notification settings - Fork 0
/
recruit.html
1 lines (1 loc) · 19 KB
/
recruit.html
1
<!DOCTYPE html><html lang="en"><head><meta charSet="utf-8"/><meta name="viewport" content="width=device-width"/><title>Recruit</title><meta content="RHOS-Recruitment" name="description"/><link href="https://http://mvig-rhos.com/recruit" rel="canonical"/><link href="/favicon.ico" rel="icon" sizes="any"/><link href="/icon.svg" rel="icon" type="image/svg+xml"/><link href="/apple-touch-icon.png" rel="apple-touch-icon"/><link href="/site.webmanifest" rel="manifest"/><meta content="Recruit" property="og:title"/><meta content="RHOS-Recruitment" property="og:description"/><meta content="https://http://mvig-rhos.com/recruit" property="og:url"/><meta content="Recruit" name="twitter:title"/><meta content="RHOS-Recruitment" name="twitter:description"/><meta name="next-head-count" content="14"/><meta charSet="utf-8"/><meta content="notranslate" name="google"/><link rel="preload" href="/_next/static/css/144e01f988e2575a.css" as="style"/><link rel="stylesheet" href="/_next/static/css/144e01f988e2575a.css" data-n-g=""/><noscript data-n-css=""></noscript><script defer="" nomodule="" src="/_next/static/chunks/polyfills-c67a75d1b6f99dc8.js"></script><script src="/_next/static/chunks/webpack-aa92048316e96af3.js" defer=""></script><script src="/_next/static/chunks/framework-7751730b10fa0f74.js" defer=""></script><script src="/_next/static/chunks/main-a9b50f256c2cfb57.js" defer=""></script><script src="/_next/static/chunks/pages/_app-18e1e0f85fa3b58e.js" defer=""></script><script src="/_next/static/chunks/526-2e1a63ee81ce1f73.js" defer=""></script><script src="/_next/static/chunks/342-08dd1fd52b1d6335.js" defer=""></script><script src="/_next/static/chunks/675-717e3cc8fb67a947.js" defer=""></script><script src="/_next/static/chunks/345-b3e1ef088eb31a90.js" defer=""></script><script src="/_next/static/chunks/pages/recruit-dadcff85fabd6f57.js" defer=""></script><script src="/_next/static/27sOB7OcKYXLbYhtS4T1I/_buildManifest.js" defer=""></script><script src="/_next/static/27sOB7OcKYXLbYhtS4T1I/_ssgManifest.js" defer=""></script></head><body class="bg-neutral-900"><div id="__next"><button aria-label="Menu Button" class="fixed top-2 right-2 z-40 rounded-md bg-orange-500 p-2 ring-offset-gray-800/60 hover:bg-orange-400 focus:outline-none focus:ring-0 focus-visible:ring-2 focus-visible:ring-orange-500 focus-visible:ring-offset-2 sm:hidden"><svg xmlns="http://www.w3.org/2000/svg" fill="none" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" aria-hidden="true" class="h-8 w-8 text-white"><path stroke-linecap="round" stroke-linejoin="round" d="M4 6h16M4 12h16m-7 6h7"></path></svg><span class="sr-only">Open sidebar</span></button><header class="fixed top-0 z-50 hidden w-full bg-neutral-900/50 p-4 backdrop-blur sm:block" id="headerNav"><nav class="flex justify-center gap-x-8"><a class="-m-1.5 p-1.5 rounded-md font-bold first-letter:uppercase hover:transition-colors hover:duration-300 focus:outline-none focus-visible:ring-2 focus-visible:ring-orange-500 sm:hover:text-orange-500 text-neutral-100 text-neutral-100" href="/">Home</a><a class="-m-1.5 p-1.5 rounded-md font-bold first-letter:uppercase hover:transition-colors hover:duration-300 focus:outline-none focus-visible:ring-2 focus-visible:ring-orange-500 sm:hover:text-orange-500 text-neutral-100 text-neutral-100" href="/recruit#about">about</a><a class="-m-1.5 p-1.5 rounded-md font-bold first-letter:uppercase hover:transition-colors hover:duration-300 focus:outline-none focus-visible:ring-2 focus-visible:ring-orange-500 sm:hover:text-orange-500 text-neutral-100 text-neutral-100" href="/recruit#demo">demo</a></nav></header><div class="relative flex h-screen-no w-screen items-center justify-center bg-neutral-100"><div class="flex flex-col z-10 w-full max-w-screen-lg p-4 lg:px-0 items-center text-center "><div class="h-20"></div><h1 class="text-3xl font-bold text-gray-800 sm:text-4xl lg:text-5xl p-4">MVIG-RHOS招新</h1></div></div><section class="bg-neutral-100 px-4 py-8 md:py-12 lg:px-8" id="about"><div class="mx-auto max-w-screen-lg"><div class="flex flex-col"><div class="grid justify-items-center pb-8"></div><div class="flex-1"><p>各位同学大家好!</p><p> 我是李永露(长聘教轨助理教授,博导),感谢关注上海交大RHOS实验室,RHOS隶属于电子信息与电气工程学院-清源研究院和MVIG实验室(与卢策吾教授co-supervise),我的基本情况如下:</p><ul><li><span class="text-lg"> •</span>在人工智能领域发表论文30余篇(TPAMI, NeurIPS, CVPR, ICCV, ECCV, etc);</li><li><span class="text-lg"> •</span>开发了开源系统<a class="underline text-sky-600" href="http://hake-mvig.cn/">HAKE</a>,探索以知识驱动的推理方式理解人类行为,其官网得到全球十几万次访问;</li><li><span class="text-lg"> •</span>NeurIPS Area Chair,曾获得NeurIPS杰出审稿人(2020、2021)、百度奖学金、WAIC云帆奖(明日之星、璀璨明星)、吴文俊奖优博(中国人工智能学会)、上海市优秀毕业生、2020华人学生AI百人(机器学习top-10)、上交85-杨元庆基金优秀博士等。</li></ul></div><div class="flex-1"><h2 class="text-xl font-bold text-neutral-800">实验室简介</h2><p> 我们的<b>目标</b>是做出类似C-3PO和R2-D2的智能(真实/模拟)机器人。实验室的研究<b>方向</b>包括:</p><ol><li>1. 具身智能:如何让机器人学习人类的技能并与人类交互?<div><p> a. 人类行为理解:如何从多模态信息(2D-3D-4D)中学习和理解复杂、模糊的人类行为(身体运动,人-物体/人/场景 交互)和物体概念;</p><p> b. 视觉推理: 如何从人类行为中挖掘、捕获和编码逻辑、因果关系;</p><p> c. 通用多模态基础模型:特别是以人为中心的感知与理解任务;</p><p> d. 基于认知的行为理解:与跨学科研究团队合作,探索大脑如何感知人类行为;</p></div></li><li>2. 人机交互(如:智能医院):与上海交通大学的医疗团队(医生和工程师)合作,开发辅助人类的机器人。</li></ol><br/><p>我们有非常高水平的博士/硕士团队,以及具有浓厚科研兴趣的实习生(数据截至2024.4):</p><div><p> ● 10+同学曾是交大本科所在系前五名;</p><p> ● 6名本科实习生以<b>一作</b>发表顶会(CVPR、ECCV);</p><p> ● 27名本科实习生以co-author发表顶会/顶刊(TPAMI、CVPR、NeurIPS、ECCV、ICCV);</p><p> ● 2名本科实习生前往<b>Stanford, UCB</b>等北美CS四大深造(硕博)。</p><p> ● 2名本科实习生获得商汤奖学金(全国25人/年)。</p></div><br/><p>我们追求自由与理性,努力营造团结、紧张、严肃、活泼的研究氛围:</p><div><p> ● 每周组会有<b>一对一</b>和<b>小组讨论</b>两种形式,根据实际需求进行。</p><p> ● 论文方面,我们追求顶级机器学习、计算机视觉、机器人方向的会议和期刊(如ICLR、 NeurlPS、ICML、CVPR、ICCV、ECCV、ICRA、CoRL、RSS、TPAMI、IJRR, T-RO等)。</p><p> ● 与工业届有密切的合作,积极推动我们的研究工作落地应用,也可以推荐实习生前往人工智能和机器人的工业界团队(如腾讯、华为、快手、商汤、非夕等)实习。</p><p> ● 足够的研究资金和硬件设施(50+ GPUs、机器人、VR、人体工程设备)。</p></div></div><br/><div class="flex-1"><h2 class="text-xl font-bold text-neutral-800">招生</h2><p> 实验室每年会招收多名博士生及硕士生,同时<b>长期招收实习生(本硕博均有)</b>。 英语水平、科研经历、代码能力和深度学习基础将会是你的加分项,但不是必需项。 只要你有强烈的自驱力和科研兴趣,能展示出有价值的科学研究<b>潜力</b>或优秀工程<b>技能</b>,欢迎加入我们!</p><br/><h3 class="text-xl font-bold text-neutral-800">本科生培养</h3><p>我们致力于“world-class research, world-class students”,给学生予充分的指导。我们的培养分为四个阶段:</p><div><p> ● Stage-0: 培训掌握基本算法和技术,包括深度学习/机器人基础课程的学习、基础工具的学习等。 可参见<a class="underline text-sky-600" href="https://mvig-rhos.gitbook.io/rhos-ke-yan-shou-ce/">Research_in_RHOS</a>。</p><p> ● Stage-1: 参与研究课题,在老师和高年级学生的带领下,发表一篇co-author论文,体验完整的研究过程: 好奇心--问题--思考--验证实验--动机--洞察力--反复试验--写论文--提交论文--如果不幸拒稿修改后recycle--准备最终版--发表论文--参加会议。</p><p> ● Stage-2: 提出一个好的问题,在指导下每周推进,独立发表一作顶会论文。</p><p> ● Stage-3: 提出一个高水平的idea,独立进行研究,与老师同学们互相学习。</p></div></div><br/><h3 class="text-xl font-bold text-neutral-800">未来发展</h3><p> ● 保研/直博:实习期间表现优异的同学优先保研,目前实验室研究生大部分来自本科实习生。 若实习期内表现突出且能够获得推免资格,实验室会竭尽全力为大家提供组内的保研机会。</p><p> ● 出国申请:每年实验室会帮助很多实习生同学准备申请工作, 包括科研训练、申请材料准备、暑研学校选择、北美导师选择、推荐信等, 实验室也已有很多同学去Stanford、MIT、CMU等北美名校的人工智能和机器人实验室暑研、读博。</p></div></div></section><section class="bg-neutral-100 px-4 py-8 md:py-12 lg:px-8" id="demo"><div class="mx-auto max-w-screen-lg"><div class="grid grid-cols-1 gap-y-4 py-8 first:pt-0 last:pb-0 md:grid-cols-4"><div class="col-span-1 flex justify-center md:justify-start"><div class="relative h-max"><h2 class="text-xl font-bold uppercase text-neutral-800">我们的工作</h2><span class="absolute inset-x-0 border-b-2 border-orange-400"></span><div></div></div></div><div class="col-span-1 flex flex-col md:col-span-3"><div class="flex flex-col"><h2 class="text-xl font-bold text-neutral-800">Robot Brain</h2><p>从数据和任务端重新定义具身智能学习的范式, 在有限、多模态、noisy、异构数据的条件下, 研究<b>可泛化、可解释且具有推理能力的机器人“大脑”</b>,与现有技术配合实现智能机器人应用。</p><div class="w-1/2"><span style="box-sizing:border-box;display:inline-block;overflow:hidden;width:initial;height:initial;background:none;opacity:1;border:0;margin:0;padding:0;position:relative;max-width:100%"><span style="box-sizing:border-box;display:block;width:initial;height:initial;background:none;opacity:1;border:0;margin:0;padding:0;max-width:100%"><img style="display:block;max-width:100%;width:initial;height:initial;background:none;opacity:1;border:0;margin:0;padding:0" alt="" aria-hidden="true" src="data:image/svg+xml,%3csvg%20xmlns=%27http://www.w3.org/2000/svg%27%20version=%271.1%27%20width=%271170%27%20height=%27655%27/%3e"/></span><img alt="egopca" src="data:image/gif;base64,R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7" decoding="async" data-nimg="intrinsic" class="place-self-center" style="position:absolute;top:0;left:0;bottom:0;right:0;box-sizing:border-box;padding:0;border:none;margin:auto;display:block;width:0;height:0;min-width:100%;max-width:100%;min-height:100%;max-height:100%"/><noscript><img alt="egopca" src="/_next/static/media/image.699d1601.jpg" decoding="async" data-nimg="intrinsic" style="position:absolute;top:0;left:0;bottom:0;right:0;box-sizing:border-box;padding:0;border:none;margin:auto;display:block;width:0;height:0;min-width:100%;max-width:100%;min-height:100%;max-height:100%" class="place-self-center" loading="lazy"/></noscript></span></div><div class="w-1/4"><span style="box-sizing:border-box;display:inline-block;overflow:hidden;width:initial;height:initial;background:none;opacity:1;border:0;margin:0;padding:0;position:relative;max-width:100%"><span style="box-sizing:border-box;display:block;width:initial;height:initial;background:none;opacity:1;border:0;margin:0;padding:0;max-width:100%"><img style="display:block;max-width:100%;width:initial;height:initial;background:none;opacity:1;border:0;margin:0;padding:0" alt="" aria-hidden="true" src="data:image/svg+xml,%3csvg%20xmlns=%27http://www.w3.org/2000/svg%27%20version=%271.1%27%20width=%271280%27%20height=%271707%27/%3e"/></span><img alt="egopca" src="data:image/gif;base64,R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7" decoding="async" data-nimg="intrinsic" class="place-self-center" style="position:absolute;top:0;left:0;bottom:0;right:0;box-sizing:border-box;padding:0;border:none;margin:auto;display:block;width:0;height:0;min-width:100%;max-width:100%;min-height:100%;max-height:100%"/><noscript><img alt="egopca" src="/_next/static/media/image-2.713e44dc.jpg" decoding="async" data-nimg="intrinsic" style="position:absolute;top:0;left:0;bottom:0;right:0;box-sizing:border-box;padding:0;border:none;margin:auto;display:block;width:0;height:0;min-width:100%;max-width:100%;min-height:100%;max-height:100%" class="place-self-center" loading="lazy"/></noscript></span></div><br/><h2 class="text-xl font-bold text-neutral-800">HAKE</h2><p><b>推理驱动的人类行为知识系统</b>:使智能体能够感知人类行为、推理人类行为逻辑、从人类行为中学习技能,并与物体和环境进行交互。</p><div class="w-3/4"><span style="box-sizing:border-box;display:inline-block;overflow:hidden;width:initial;height:initial;background:none;opacity:1;border:0;margin:0;padding:0;position:relative;max-width:100%"><span style="box-sizing:border-box;display:block;width:initial;height:initial;background:none;opacity:1;border:0;margin:0;padding:0;max-width:100%"><img style="display:block;max-width:100%;width:initial;height:initial;background:none;opacity:1;border:0;margin:0;padding:0" alt="" aria-hidden="true" src="data:image/svg+xml,%3csvg%20xmlns=%27http://www.w3.org/2000/svg%27%20version=%271.1%27%20width=%27628%27%20height=%27348%27/%3e"/></span><img alt="hake" src="data:image/gif;base64,R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7" decoding="async" data-nimg="intrinsic" class="place-self-center" style="position:absolute;top:0;left:0;bottom:0;right:0;box-sizing:border-box;padding:0;border:none;margin:auto;display:block;width:0;height:0;min-width:100%;max-width:100%;min-height:100%;max-height:100%"/><noscript><img alt="hake" src="/_next/static/media/2022_hake2.0.38642608.jpg" decoding="async" data-nimg="intrinsic" style="position:absolute;top:0;left:0;bottom:0;right:0;box-sizing:border-box;padding:0;border:none;margin:auto;display:block;width:0;height:0;min-width:100%;max-width:100%;min-height:100%;max-height:100%" class="place-self-center" loading="lazy"/></noscript></span></div><br/><h2 class="text-xl font-bold text-neutral-800">OCL</h2><p><b>物体概念学习</b>:提出涉及物体属性、可供性等与人类行为紧密相关的物体概念, 以推动机器对物体的理解,并基于因果图模型提出了因果推理基准和基线模型。</p><video autoplay="" loop="" controls=""><source src="/media/demo_small.mp4" type="video/mp4"/></video><a class="underline text-sky-600" href="https://www.bilibili.com/video/BV1Vm4y1V7aC/?share_source=copy_web&vd_source=33c221d66435cf014ff6a86a1ddd62b8">Full demo on BiliBili</a><br/><h2 class="text-xl font-bold text-neutral-800">Pangea</h2><p>根据<b>动词分类层次结构</b>设计了动作语义空间,涵盖了大量人类行为,从而将多模态数据集聚合到一个统一的数据池中, 使用统一的标签系统。相应地,提出了一个在物理空间和语义空间之间进行双向映射的模型,以促进行为理解领域的“统一度量衡”。</p><video autoplay="" loop="" controls=""><source src="/media/demo_pangea.mp4" type="video/mp4"/></video><br/><h2 class="text-xl font-bold text-neutral-800">EgoPCA</h2><p>提出了一个全新的<b>手-物体交互理解框架</b>,通过对手-物体交互数据的探测、分析和重采样, 提供了更平衡而全面的预训练集、测试集和测试基准,并使用专门针对手-物体交互的预训练策略以及下游微调有效机制, 推动手-物交互理解的发展,以助力机器人物体操作技能学习。</p><div class="w-3/4"><span style="box-sizing:border-box;display:inline-block;overflow:hidden;width:initial;height:initial;background:none;opacity:1;border:0;margin:0;padding:0;position:relative;max-width:100%"><span style="box-sizing:border-box;display:block;width:initial;height:initial;background:none;opacity:1;border:0;margin:0;padding:0;max-width:100%"><img style="display:block;max-width:100%;width:initial;height:initial;background:none;opacity:1;border:0;margin:0;padding:0" alt="" aria-hidden="true" src="data:image/svg+xml,%3csvg%20xmlns=%27http://www.w3.org/2000/svg%27%20version=%271.1%27%20width=%272528%27%20height=%271058%27/%3e"/></span><img alt="egopca" src="data:image/gif;base64,R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7" decoding="async" data-nimg="intrinsic" class="place-self-center" style="position:absolute;top:0;left:0;bottom:0;right:0;box-sizing:border-box;padding:0;border:none;margin:auto;display:block;width:0;height:0;min-width:100%;max-width:100%;min-height:100%;max-height:100%"/><noscript><img alt="egopca" src="/_next/static/media/teaser.f010e32b.png" decoding="async" data-nimg="intrinsic" style="position:absolute;top:0;left:0;bottom:0;right:0;box-sizing:border-box;padding:0;border:none;margin:auto;display:block;width:0;height:0;min-width:100%;max-width:100%;min-height:100%;max-height:100%" class="place-self-center" loading="lazy"/></noscript></span></div></div></div></div></div></section><div class="relative bg-neutral-900 px-4 pb-6 pt-12 sm:px-8 sm:pt-14 sm:pb-8"><div class="absolute inset-x-0 -top-4 flex justify-center sm:-top-6"><a class="rounded-full bg-neutral-100 p-1 ring-white ring-offset-2 ring-offset-gray-700/80 focus:outline-none focus:ring-2 sm:p-2" href="/#hero"><svg xmlns="http://www.w3.org/2000/svg" fill="none" viewBox="0 0 24 24" stroke-width="2" stroke="currentColor" aria-hidden="true" class="h-6 w-6 bg-transparent sm:h-8 sm:w-8"><path stroke-linecap="round" stroke-linejoin="round" d="M5 15l7-7 7 7"></path></svg></a></div><div class="flex flex-col items-center gap-y-6"><div id="pageview-script" class="text-sm text-neutral-700"><a href="https://www.revolvermaps.com/livestats/5r1om30zfoi/"><img src="//rf.revolvermaps.com/h/m/a/0/ff0000/128/0/5r1om30zfoi.png" width="256" height="128" alt="Map" style="border:0"/></a><script type="text/javascript" id="clstr_globe" src="//clustrmaps.com/globe.js?d=ko7teOw_sX7QKyWbHLxkMdyOA6BYkSEu0Fo1wnSs9QE"></script></div><span class="text-sm text-neutral-700">© Copyright 2022 MVIG-RHOS • Based on<!-- --> <a href="https://github.com/tbakerx/react-resume-template">tbakerx</a></span></div></div></div><script id="__NEXT_DATA__" type="application/json">{"props":{"pageProps":{}},"page":"/recruit","query":{},"buildId":"27sOB7OcKYXLbYhtS4T1I","nextExport":true,"autoExport":true,"isFallback":false,"scriptLoader":[]}</script></body></html>