# Robots can save us if they can see us: Heckman receives CAREER award

*Oct. 15, 2024 · By Grace Wilson · Research · Tags: Chris Heckman*

*A radar breakthrough in robotic sensing, helping systems see and act in smoke and darkness, has been recognized with a $600,000 National Science Foundation award.*

[Image: A SPOT robot with a light enters a dark mine tunnel.]

Autonomous robots could save human lives more easily if they could "see" and react better in adverse environmental conditions. By pursuing the possibilities of using millimeter wave radar for robotic perception, [Christoffer Heckman](/cs/christoffer-heckman) is making this fundamental shift possible.

An associate professor of computer science at CU Boulder, Heckman will receive $600,000 over the next five years through the National Science Foundation's CAREER award for this research.

Currently, most robots use sensors based on the visible spectrum of light, such as cameras or lasers. In environments with smoke, fog or dust, however, visible light bounces off these particles. Robots, like humans, can't plan their movements accurately if they don't know where they are or what is around them.

"Humans operating in a visually degraded environment are in trouble. We cannot solve that problem, but incorporating millimeter wave radar could enable our robots to do things that even humans can't do," Heckman said.

This is because millimeter waves pass through smoke, fog and dust.

## A new path

Traditionally, Heckman explained, radar has been viewed with skepticism for these kinds of tasks: the sensors have been too large and energy-intensive for agile robots, and radar's long wavelength creates complex, confusing signals.

With the advent of new, smaller system-on-a-chip radar sensors, the traditional energy and size limitations have been removed. This leaves the complexity of radar waveform signals.

"This is a fascinating problem," Heckman explained. "People really understand how radar works, down to equations that have existed for almost a century, but radar can be difficult to precisely interpret in cluttered environments. It bounces around within an enclosed area, and can pass right through small objects."

Heckman's solution is to fuse our knowledge of electromagnetic waves with supervised machine learning. Datasets from high-fidelity optical sensors are paired with low-fidelity radar signals of the same scene, and machine learning cleans the radar signal to match the high-fidelity scene. The trained system can then build clear radar reconstructions of environments where optical sensors are obscured.
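The article doesn't describe the implementation, but cross-modal supervision of this kind is commonly set up as a paired denoising problem: the raw radar return is the input, and a clean map derived from the optical sensor of the same scene is the training target. A minimal sketch in PyTorch; the network, the data and all names here are illustrative assumptions, not Heckman's actual system:

```python
# Hypothetical sketch of cross-modal supervised denoising: a small network
# learns to map noisy radar heatmaps to occupancy maps derived from a
# high-fidelity optical sensor (e.g. lidar) of the same scene.
import torch
import torch.nn as nn

class RadarDenoiser(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 1, kernel_size=3, padding=1), nn.Sigmoid(),
        )

    def forward(self, radar):
        return self.net(radar)  # predicted occupancy in [0, 1]

def train_step(model, optimizer, radar, optical_target):
    """One supervised step: radar in, optical-derived ground truth as target."""
    optimizer.zero_grad()
    loss = nn.functional.binary_cross_entropy(model(radar), optical_target)
    loss.backward()
    optimizer.step()
    return loss.item()

model = RadarDenoiser()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
radar = torch.rand(8, 1, 64, 64)                   # stand-in radar heatmaps
target = (torch.rand(8, 1, 64, 64) > 0.5).float()  # stand-in optical ground truth
print(train_step(model, opt, radar, target))
```

At deployment only the radar branch is needed, which is what makes this framing attractive in smoke or darkness, exactly where the optical "teacher" sensor fails.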
This powerful synthesis of physics and computer science stands to dramatically improve radar's capability as a perception sensor.

## Beyond sensing

Heckman has further plans as well: he wants to use this advance to support quick, accurate action and replanning for autonomous systems.

Robotic thinking has traditionally followed the saying "sense, plan, act": a robot understands a scene, plans its route according to its inputs, and acts on that plan. Segmenting these activities, however, can lead to slow movement and an inability to react to changes.
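As a schematic contrast, not the project's actual architecture: a strictly staged loop finishes each phase before starting the next, while an interleaved loop re-checks its sensors at every step and replans as soon as the world changes. All robot methods below are hypothetical placeholders:

```python
# Schematic contrast between a segmented sense-plan-act loop and a reactive
# variant; every robot method here is a hypothetical placeholder.

def sense_plan_act(robot):
    """Classic pipeline: each stage runs to completion before the next."""
    while not robot.at_goal():
        world = robot.sense()         # build a full picture of the scene
        plan = robot.plan(world)      # compute the whole route up front
        for step in plan:
            robot.act(step)           # execute blindly; can't react mid-plan

def reactive_loop(robot):
    """Interleaved variant: sense every step, replan only when needed."""
    plan = []
    while not robot.at_goal():
        world = robot.sense()
        if not plan or robot.plan_invalidated(plan, world):
            plan = robot.plan(world)  # replan the moment the world changes
        robot.act(plan.pop(0))        # take one step, then sense again
```

The second loop only pays off if sensing is fast and reliable in the first place, which is where radar that works through smoke and dust comes in.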
Heckman seeks to use radar in conjunction with optical and lidar sensors to improve navigation strategies while a robot is moving through a space, allowing it to respond more quickly to changes.

Robots that can plan better for themselves and can see into obscured spaces have a valuable role in search-and-rescue, firefighting and space missions. Heckman's MARBLE team has [used robots to explore dark caves](/engineering/2023/11/17/building-next-generation-autonomous-robots-serve-humanity) through the DARPA Subterranean Challenge and as firefighting assistants finding active embers. As the research advances made possible by this CAREER award take shape, where will robots be able to see next?

---

# Video - ChatGPT: Fear, Hype, or Hope? Education and research practices and ethics in the generative AI era

*April 20, 2023 · By Grace Wilson · Research · Tags: Jim Martin, Research, Tom Yeh*

*In this panel discussion, attended by more than 300 people from the university and the general public, leading experts discuss the technical foundations of ChatGPT and other generative AI, the uses of generative AI in university and K-12 education, and the ethical and societal issues these tools raise.*

Three leading experts discuss how the disruptive and powerful elements of ChatGPT and other generative AI stand to transform our world. Jim Martin clarifies what a large language model like ChatGPT actually is, Diane Sieber urges the creation of norms around the use of these tools, and Tom Yeh focuses on the potential impacts on education.

[Video: https://youtu.be/6nkKFmFOoOE]

## Panelists

- [Jim Martin](/cs/james-martin) (Department of Computer Science and Institute of Cognitive Science), an expert on natural language processing and large language models
href="/herbst/diane-sieber" rel="nofollow">Diane Sieber</a> (Herbst Program for Engineering, Ethics &amp; Society), a pioneer in education bridging technology, humanities and arts</li> <li><a href="/cs/tom-yeh" rel="nofollow">Tom Yeh</a> (computer science), a leading researcher in human-computer interaction who has studied the use of generative AI in introductory programming and K-12 settings.&nbsp;</li> </ul> <h2>Moderator:</h2> <ul> <li><a href="/cs/bobby-schnabel" rel="nofollow">Bobby Schnabel</a>, external chair of computer science, founding director of the ATLAS Institute and former CEO of the Association for Computing Machinery.</li> </ul></div> </div> </div> </div> </div> <div>Through this panel discussion attended by over 300 people from the university and general public, hear from leading experts on the technical areas underlying ChatGPT and other generative AI, the uses of generative AI in university and K-12 education, and the ethical and societal issues associated with generative AI tools.</div> <h2> <div class="paragraph paragraph--type--ucb-related-articles-block paragraph--view-mode--default"> <div>Off</div> </div> </h2> <div>Traditional</div> <div>0</div> <div>On</div> <div>White</div> Thu, 20 Apr 2023 23:26:18 +0000 Anonymous 2250 at /cs AI has social consequences, but who pays the price? Tech companies’ problem with ‘ethical debt’ /cs/2023/04/20/ai-has-social-consequences-who-pays-price-tech-companies-problem-ethical-debt <span>AI has social consequences, but who pays the price? Tech companies’ problem with ‘ethical debt’</span> <span><span>Anonymous (not verified)</span></span> <span><time datetime="2023-04-20T17:05:20-06:00" title="Thursday, April 20, 2023 - 17:05">Thu, 04/20/2023 - 17:05</time> </span> <div> <div class="imageMediaStyle focal_image_wide"> <img loading="lazy" src="/cs/sites/default/files/styles/focal_image_wide/public/article-thumbnail/web-ex-presizes_5.png?h=b61f7220&amp;itok=dsAp00hB" width="1200" height="600" alt="Two people's silhouettes made of circuits flank an old drawing of a factory"> </div> </div> <div role="contentinfo" class="container ucb-article-categories" itemprop="about"> <span class="visually-hidden">Categories:</span> <div class="ucb-article-category-icon" aria-hidden="true"> <i class="fa-solid fa-folder-open"></i> </div> <a href="/cs/taxonomy/term/457"> Research </a> </div> <div role="contentinfo" class="container ucb-article-tags" itemprop="keywords"> <span class="visually-hidden">Tags:</span> <div class="ucb-article-tag-icon" aria-hidden="true"> <i class="fa-solid fa-tags"></i> </div> <a href="/cs/taxonomy/term/517" hreflang="en">Casey Fiesler</a> <a href="/cs/taxonomy/term/439" hreflang="en">Research</a> </div> <div class="ucb-article-content ucb-striped-content"> <div class="container"> <div class="paragraph paragraph--type--article-content paragraph--view-mode--default"> <div class="ucb-article-content-media ucb-article-content-media-above"> <div> <div class="paragraph paragraph--type--media paragraph--view-mode--default"> </div> </div> </div> <div class="ucb-article-text d-flex align-items-center" itemprop="articleBody"> </div> </div> </div> </div> <div>Casey Fiesler, Associate Professor of Information Science with a courtesy appointment in Computer Science, writes for The Conversation about how we can tackle possible negative consequences and societal harms from AI development. Links to external article. 
</div> <script> window.location.href = `https://theconversation.com/ai-has-social-consequences-but-who-pays-the-price-tech-companies-problem-with-ethical-debt-203375`; </script> <h2> <div class="paragraph paragraph--type--ucb-related-articles-block paragraph--view-mode--default"> <div>Off</div> </div> </h2> <div>Traditional</div> <div>0</div> <div>On</div> <div>White</div> Thu, 20 Apr 2023 23:05:20 +0000 Anonymous 2248 at /cs Computer science PhD student leads research into secrets of honeycomb formation /cs/2023/04/20/computer-science-phd-student-leads-research-secrets-honeycomb-formation <span>Computer science PhD student leads research into secrets of honeycomb formation</span> <span><span>Anonymous (not verified)</span></span> <span><time datetime="2023-04-20T16:37:56-06:00" title="Thursday, April 20, 2023 - 16:37">Thu, 04/20/2023 - 16:37</time> </span> <div> <div class="imageMediaStyle focal_image_wide"> <img loading="lazy" src="/cs/sites/default/files/styles/focal_image_wide/public/article-thumbnail/6.jpg?h=7c43dcef&amp;itok=43ouv7Ja" width="1200" height="600" alt="Honey bee working on honeycomb"> </div> </div> <div role="contentinfo" class="container ucb-article-categories" itemprop="about"> <span class="visually-hidden">Categories:</span> <div class="ucb-article-category-icon" aria-hidden="true"> <i class="fa-solid fa-folder-open"></i> </div> <a href="/cs/taxonomy/term/457"> Research </a> </div> <div role="contentinfo" class="container ucb-article-tags" itemprop="keywords"> <span class="visually-hidden">Tags:</span> <div class="ucb-article-tag-icon" aria-hidden="true"> <i class="fa-solid fa-tags"></i> </div> <a href="/cs/taxonomy/term/437" hreflang="en">Orit Peleg</a> <a href="/cs/taxonomy/term/439" hreflang="en">Research</a> </div> <span>Josh Rhoten</span> <div class="ucb-article-content ucb-striped-content"> <div class="container"> <div class="paragraph paragraph--type--article-content paragraph--view-mode--default"> <div class="ucb-article-content-media ucb-article-content-media-above"> <div> <div class="paragraph paragraph--type--media paragraph--view-mode--default"> </div> </div> </div> <div class="ucb-article-text d-flex align-items-center" itemprop="articleBody"> </div> </div> </div> </div> <div>Researchers in the Department of Computer Science and BioFrontiers Institute are studying honeycomb formation in bees with the hope of one day recreating the same intricate and impressive hexagonal structures for other uses.</div> <script> window.location.href = `/engineering/2023/04/12/computer-science-phd-student-leads-research-secrets-honeycomb-formation`; </script> <h2> <div class="paragraph paragraph--type--ucb-related-articles-block paragraph--view-mode--default"> <div>Off</div> </div> </h2> <div>Traditional</div> <div>0</div> <div>On</div> <div>White</div> Thu, 20 Apr 2023 22:37:56 +0000 Anonymous 2246 at /cs Talking with the fireflies: Orit Peleg receives CAREER Award and Alfred P. Sloan Fellowship /cs/2023/02/17/talking-fireflies-orit-peleg-receives-career-award-and-alfred-p-sloan-fellowship <span>Talking with the fireflies: Orit Peleg receives CAREER Award and Alfred P. 
Sloan Fellowship</span> <span><span>Anonymous (not verified)</span></span> <span><time datetime="2023-02-17T10:36:27-07:00" title="Friday, February 17, 2023 - 10:36">Fri, 02/17/2023 - 10:36</time> </span> <div> <div class="imageMediaStyle focal_image_wide"> <img loading="lazy" src="/cs/sites/default/files/styles/focal_image_wide/public/article-thumbnail/orit-peleg-career.png?h=43fa5d85&amp;itok=N4B_LMi3" width="1200" height="600" alt="Orit Peleg and fireflies"> </div> </div> <div role="contentinfo" class="container ucb-article-categories" itemprop="about"> <span class="visually-hidden">Categories:</span> <div class="ucb-article-category-icon" aria-hidden="true"> <i class="fa-solid fa-folder-open"></i> </div> <a href="/cs/taxonomy/term/457"> Research </a> </div> <div role="contentinfo" class="container ucb-article-tags" itemprop="keywords"> <span class="visually-hidden">Tags:</span> <div class="ucb-article-tag-icon" aria-hidden="true"> <i class="fa-solid fa-tags"></i> </div> <a href="/cs/taxonomy/term/506" hreflang="en">CAREER awards</a> <a href="/cs/taxonomy/term/437" hreflang="en">Orit Peleg</a> <a href="/cs/taxonomy/term/439" hreflang="en">Research</a> </div> <a href="/cs/grace-wilson">Grace Wilson</a> <div class="ucb-article-content ucb-striped-content"> <div class="container"> <div class="paragraph paragraph--type--article-content paragraph--view-mode--default"> <div class="ucb-article-content-media ucb-article-content-media-above"> <div> <div class="paragraph paragraph--type--media paragraph--view-mode--default"> </div> </div> </div> <div class="ucb-article-text d-flex align-items-center" itemprop="articleBody"> <div><p>If we as humans can understand how fireflies in swarms synchronize, we can understand their communication more deeply. Their on-off flickers of light are more similar to our binary computational logic than the tremulous variations of a frog croak or pitches of bugling elk calls.&nbsp;&nbsp;</p> <p>Fireflies' elegant, distributed communication systems could eventually help us with our own telecommunications through new ideas about compressing information and distributed networks.&nbsp;</p> <p>Assistant Professor of computer science <a href="/biofrontiers/orit-peleg" rel="nofollow">Orit Peleg</a> has just received $900,000 over the next five years to learn how fireflies in a swarm synchronize their lighting displays. The funding was provided by a National Science Foundation CAREER award, a highly prestigious early-career grant for junior faculty members.&nbsp;</p> <p>Peleg, a member of the <a href="/biofrontiers/" rel="nofollow">Biofrontiers Institute</a>&nbsp;and <a href="/cs/" rel="nofollow">Department of Computer Science</a>, seeks to create testable theories about animal communication with her lab by merging tools from physics, biology, math and computer science.&nbsp;</p> <p>Female fireflies judge the blinking displays of males to determine their mate. 
When swarms of small, blinking insects can stretch over miles in the dark, any way to cut down on visual clutter is important, and this is where synchronizing flash patterns comes in.

"There are some really interesting questions about how really similar signals on the level of individual fireflies can result in a different collective signal," Peleg said.

Some groups of fireflies begin a burst of flashing together and stop the burst together, while others flash in a matching pattern but offset in time from one another, so that the overall swarm always has some members flashing.

Currently, our telecommunication networks and other human-made technology require synchronization to a central clock, which makes the network highly sensitive to failures of that clock.

"But that's not how biology does it. Biology achieves synchronization in a distributed way, which is more robust to failures of individual nodes," Peleg said.
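The story doesn't give the underlying math, but distributed, clock-free synchronization of this kind is classically modeled with pulse-coupled oscillators, as in Mirollo and Strogatz's firefly model: each individual runs its own internal clock, and every flash it sees nudges that clock forward. A toy sketch with illustrative parameters:

```python
# Toy pulse-coupled oscillator model of distributed synchronization: each
# firefly advances a private clock, flashes when it reaches 1.0, and is
# nudged forward by every flash it sees. No central clock exists anywhere.
import random

N, TICK, NUDGE, STEPS = 20, 0.01, 0.05, 5000
phases = [random.random() for _ in range(N)]    # random initial clocks

biggest_flash = 0
for _ in range(STEPS):
    phases = [p + TICK for p in phases]         # free-running clocks advance
    flashers = {i for i, p in enumerate(phases) if p >= 1.0}
    biggest_flash = max(biggest_flash, len(flashers))
    for i in range(N):
        if i in flashers:
            phases[i] = 0.0                     # flash, then reset
        elif flashers:
            # each observed flash pulls this firefly closer to flashing,
            # so nearby clocks get absorbed into ever-larger clusters
            phases[i] += NUDGE * len(flashers)

print(f"largest simultaneous flash: {biggest_flash} of {N} fireflies")
```

In this toy model the largest simultaneous flash tends to grow as clusters absorb one another, which illustrates the robustness Peleg describes: remove any individual and the rest keep converging.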
</div> <h2> <div class="paragraph paragraph--type--ucb-related-articles-block paragraph--view-mode--default"> <div>Off</div> </div> </h2> <div>Traditional</div> <div>0</div> <div>On</div> <div>White</div> Fri, 17 Feb 2023 17:36:27 +0000 Anonymous 2203 at /cs Keeping the unseen safe: Improving digital privacy for blind people /cs/2021/11/19/keeping-unseen-safe-improving-digital-privacy-blind-people <span>Keeping the unseen safe: Improving digital privacy for blind people</span> <span><span>Anonymous (not verified)</span></span> <span><time datetime="2021-11-19T11:19:51-07:00" title="Friday, November 19, 2021 - 11:19">Fri, 11/19/2021 - 11:19</time> </span> <div> <div class="imageMediaStyle focal_image_wide"> <img loading="lazy" src="/cs/sites/default/files/styles/focal_image_wide/public/article-thumbnail/untitled_design-8.png?h=b96eee4b&amp;itok=yTh3flji" width="1200" height="600" alt="An illustration of a blurred blue eye with an in-focus white keyhole in the black pupil"> </div> </div> <div role="contentinfo" class="container ucb-article-categories" itemprop="about"> <span class="visually-hidden">Categories:</span> <div class="ucb-article-category-icon" aria-hidden="true"> <i class="fa-solid fa-folder-open"></i> </div> <a href="/cs/taxonomy/term/457"> Research </a> </div> <div role="contentinfo" class="container ucb-article-tags" itemprop="keywords"> <span class="visually-hidden">Tags:</span> <div class="ucb-article-tag-icon" aria-hidden="true"> <i class="fa-solid fa-tags"></i> </div> <a href="/cs/taxonomy/term/459" hreflang="en">Danna Gurari</a> </div> <a href="/cs/grace-wilson">Grace Wilson</a> <div class="ucb-article-content ucb-striped-content"> <div class="container"> <div class="paragraph paragraph--type--article-content paragraph--view-mode--default"> <div class="ucb-article-content-media ucb-article-content-media-above"> <div> <div class="paragraph paragraph--type--media paragraph--view-mode--default"> <div> <div class="imageMediaStyle large_image_style"> <img loading="lazy" src="/cs/sites/default/files/styles/large_image_style/public/article-image/danna_0.jpg?itok=5a4iDJup" width="1500" height="1000" alt="Danna Gurari"> </div> </div> </div> </div> </div> <div class="ucb-article-text d-flex align-items-center" itemprop="articleBody"> <div><p class="text-align-right" dir="ltr"> </p><div class="image-caption image-caption-right"> <p class="text-align-right">Associate Professor Danna Gurari </p></div> <p dir="ltr">Blind people, like sighted people, post on Instagram, <a href="https://www.huffpost.com/entry/online-dating-while-blind_l_5cab919fe4b02e7a705bf6cc" rel="nofollow">swipe on Tinder</a>, and text photos of their children to a group chat. They also use photos to learn about their visual surroundings.&nbsp;</p> <p dir="ltr">Blind users often share images with identification software such as Microsoft's Seeing AI, Be My Eyes and TapTapSee to learn about their visual surroundings. There's a high demand too. 
<a href="https://news.microsoft.com/bythenumbers/en/seeing-ai" rel="nofollow">Seeing AI, for instance, has been used over 20 million times</a>.</p> <p dir="ltr">When blind people share photos, however, there is an added risk that they could unknowingly capture information considered private, such as a pregnancy test or a return address.</p> <p dir="ltr">To Assistant Professor Danna Gurari, this shouldn't have to be a concern.</p> <p dir="ltr">Gurari, the founding director of the Image and Video Computing group in the Department of Computer Science, is part of a cross-institutional team that has been awarded over $1 million through a Safe and Trustworthy Cyberspace (SaTC) grant from the National Science Foundation to study the issue.&nbsp;</p> <p dir="ltr">Currently, blind people must either trust friends or family members to vet their images for private information before sharing publicly, which can have social repercussions of its own. Or they can accept the risk to their privacy when they post.&nbsp;</p> <p dir="ltr">The goal of the team's four-year interdisciplinary project is to create a novel system that can alert users when private information is present in an image and, if the blind person wants to, obscure it.</p> <p dir="ltr">Working with human-centered computing expert Leah Findlater from the University of Washington and privacy expert Yang Wang from the University of Illinois at Urbana-Champaign, Gurari's group is leading the automatic analysis of images for the project. Their goal is to turn the desires of users and theories of private information into actionable knowledge.</p> <p dir="ltr">This comes with a number of challenges, both technical and philosophical.&nbsp;</p> <p dir="ltr">Because AI makes mistakes, you have to be careful how certain you make an analysis sound.</p> <p>"We really want to endow the appropriate level of trust but also give decision-making power," Gurari said.</p> <p dir="ltr">The Image and Video Computing group is creating ways to share what private information might be present in an image and let the user decide to use the image as-is, discard it, or obscure the private information and then share it.</p> <p dir="ltr">The other problem to solve for Gurari's group is how to determine what the most prominent object in an image is and obscure everything else.&nbsp;</p> <p dir="ltr">Because blind people often share photos for object identification, this feature could&nbsp;reduce the amount of private information introduced during this straight-forward task.&nbsp;</p> <p dir="ltr"> </p><div class="image-caption image-caption-none"> <p></p> <p>Illustration of envisioned user interaction pipeline for empowering users to safeguard private content in their pictures and videos. (a) For the general use case, the&nbsp;tool will notify the user about what private content is detected and then provide a choice to either discard the media, share it as-is, or share an edited version where private content (teel mask overlaid on image) is obfuscated. (b) For the scenario where a user wants assistance to learn about an object, the tool will share an edited version with all content outside of the foreground object (teel mask overlaid on image) obfuscated.</p> <p dir="ltr"> </p></div> <p dir="ltr">Gurari's team will be focusing on creating algorithms robust enough to counteract image blur and other properties common for images taken by blind photographers. 
Gurari's team will focus on creating algorithms robust enough to counteract image blur and other properties common to images taken by blind photographers. The team must also craft algorithms that don't need to be trained on specific objects to recognize them as important.

This object identification riddle has haunted much of computer vision's history; it is termed the "long-tail problem" for the graph it produces. Computers usually have very low object detection accuracy until they have been trained on thousands of images, but here they must grasp the significance of an object from only a few frames.

And, as with other assistive technologies, the benefits of these algorithms could go far beyond their original purpose. From product photography, which is built on isolating prominent objects, to alerting sighted users to private information they didn't notice, the project has great potential for building a safe and trustworthy cyberspace for all.