US Patent Application 17804716. USER INTERFACE FOR OPERATING ARTIFICIAL INTELLIGENCE EXPERIMENTS simplified abstract


USER INTERFACE FOR OPERATING ARTIFICIAL INTELLIGENCE EXPERIMENTS

Organization Name

Sony Group Corporation

Inventor(s)

Rory Douglas of Baltimore MD (US)

Dion Whitehead of San Diego CA (US)

Leon Barrett of Portland OR (US)

Piyush Khandelwal of Austin TX (US)

Thomas Walsh of Melrose MA (US)

Samuel Barrett of Cambridge MA (US)

Kaushik Subramanian of Metzingen (DE)

James Macglashan of Riverside RI (US)

Leilani Gilpin of Santa Cruz CA (US)

Peter Wurman of Acton MA (US)

USER INTERFACE FOR OPERATING ARTIFICIAL INTELLIGENCE EXPERIMENTS - A simplified explanation of the abstract

This abstract first appeared for US patent application 17804716, titled 'USER INTERFACE FOR OPERATING ARTIFICIAL INTELLIGENCE EXPERIMENTS'.

Simplified Explanation

- The patent application describes a user interface (UI) for analyzing machine learning experiments in a racing game environment.
- The UI is web-based and allows researchers to easily track and visualize various aspects of their experiments.
- The UI includes an experiment synchronized event viewer that synchronizes visualizations, videos, and timeline/metrics graphs, letting researchers see in detail how experiments unfold.
- The UI also generates experiment event annotations and displays them in the synchronized event viewer.
- The UI can consolidate results across experiments, including their videos.
- It provides a reusable dashboard to capture and compare metrics across multiple experiments.
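The core idea of a synchronized event viewer is that every view (video, metric graphs, annotations) shares one experiment clock, so seeking to a timestamp selects the matching video frame, metric sample, and prior events. The sketch below is a hypothetical illustration of that idea, not the patent's implementation; all names are invented.

```python
import bisect
from dataclasses import dataclass, field

@dataclass
class SyncedViewer:
    """Illustrative model of a synchronized event viewer: all views are
    indexed by a shared experiment clock (seconds since experiment start)."""
    frame_times: list                                # timestamps of video frames
    metric_times: list                               # timestamps of metric samples
    annotations: list = field(default_factory=list)  # sorted (time, label) pairs

    def annotate(self, t, label):
        # Insert an event annotation, keeping the list sorted by time.
        bisect.insort(self.annotations, (t, label))

    def seek(self, t):
        """Return the frame index, metric index, and past events aligned
        to experiment time t, so every view shows the same moment."""
        frame_idx = max(bisect.bisect_right(self.frame_times, t) - 1, 0)
        metric_idx = max(bisect.bisect_right(self.metric_times, t) - 1, 0)
        past_events = [a for a in self.annotations if a[0] <= t]
        return frame_idx, metric_idx, past_events

viewer = SyncedViewer(frame_times=[0.0, 0.5, 1.0, 1.5],
                      metric_times=[0.0, 1.0, 2.0])
viewer.annotate(0.7, "off-track")
frame, metric, events = viewer.seek(1.2)
# Seeking to t=1.2 s selects frame 2, metric sample 1, and the one
# annotation that has occurred so far.
```

Keeping a single time axis and doing per-view lookups (rather than storing views pre-merged) is what makes it cheap to add new synchronized panels, such as an annotation track.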


Original Abstract Submitted

A user interface (UI), for analyzing model training runs, tracking and visualizing various aspects of machine learning experiments, can be used when training an artificially intelligent agent in, for example, a racing game environment. The UI can be web-based and can allow researchers to easily see the status of their experiments. The UI can include an experiment synchronized event viewer that can synchronize visualizations, videos, and timeline/metrics graphs in the experiment. This viewer allows researchers to see how experiments unfold in great detail. The UI can further include experiment event annotations that can generate event annotations. These annotations can be displayed via the synchronized event viewer. The UI can be used to consider consolidated results across experiments and can further consider videos. For example, the UI can provide a reusable dashboard that can capture and compare metrics across multiple experiments.
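The reusable dashboard in the abstract consolidates one metric across several experiments for side-by-side comparison. A minimal sketch of that consolidation step, assuming experiments log metrics as lists of per-step dictionaries (the function name, metric name, and data shape are all illustrative, not from the patent):

```python
def compare_metric(experiments, metric):
    """Summarize one metric per experiment so a dashboard can show
    the runs side by side. `experiments` maps experiment name to a
    list of per-step log dicts."""
    rows = {}
    for name, logs in experiments.items():
        values = [step[metric] for step in logs if metric in step]
        rows[name] = {
            "min": min(values),                      # best value seen
            "last": values[-1],                      # most recent value
            "mean": sum(values) / len(values),       # average over the run
        }
    return rows

# Two hypothetical training runs logging lap times in a racing game.
experiments = {
    "baseline": [{"lap_time": 102.0}, {"lap_time": 98.5}, {"lap_time": 97.0}],
    "tuned":    [{"lap_time": 100.0}, {"lap_time": 95.0}],
}
summary = compare_metric(experiments, "lap_time")
```

In a real dashboard these summary rows would feed a table or chart widget; the point is that the aggregation is reusable across any metric the experiments log.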