18376717. AUTOMATIC DATA OBFUSCATION simplified abstract (Insight Direct USA, Inc.)

From WikiPatents
Revision as of 03:31, 16 April 2024 by Wikipatents (talk | contribs) (Creating a new page)

AUTOMATIC DATA OBFUSCATION

Organization Name

Insight Direct USA, Inc.

Inventor(s)

Michael Griffin of Wayland, MA (US)

Catherine Jean Snell of Durham, NC (US)

Dhairya Kothari of Chicago, IL (US)

Jason Rader of Jacksonville, FL (US)

AUTOMATIC DATA OBFUSCATION - A simplified explanation of the abstract

This abstract first appeared for US patent application 18376717, titled 'AUTOMATIC DATA OBFUSCATION'.

Simplified Explanation

The method described in the patent application involves protecting user privacy online by creating an alternate persona based on artificial user data to obscure real user data. This is achieved by collecting user data, generating a user profile, and automatically performing online activities based on the alternate persona.

  • User data is collected via a computer-based user data agent.
  • A user profile is created based on the collected user data.
  • An alternate persona is generated with traits different from the user profile.
  • Online activities are automatically performed using the alternate persona to produce artificial user data.
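The four steps above can be sketched in code. This is a minimal illustrative sketch, not the applicant's implementation: the trait categories, function names, and event format are all hypothetical, and real online activity is reduced to simple event dictionaries.

```python
import random

# Hypothetical trait categories; the application does not enumerate specific traits.
TRAIT_POOL = {
    "interests": ["cooking", "cycling", "finance", "gaming", "gardening", "travel"],
    "active_hours": ["morning", "afternoon", "evening", "night"],
}

def collect_user_data(browsing_events):
    """Stand-in for the computer-based user data agent: derive a first set of traits."""
    return {
        "interests": {e["topic"] for e in browsing_events},
        "active_hours": {e["time_of_day"] for e in browsing_events},
    }

def generate_alternate_persona(user_profile, rng=random):
    """Pick a second set of traits disjoint from the user profile's first set."""
    persona = {}
    for category, pool in TRAIT_POOL.items():
        unused = [t for t in pool if t not in user_profile[category]]
        persona[category] = set(rng.sample(unused, k=min(2, len(unused))))
    return persona

def perform_online_activities(persona, n=5, rng=random):
    """Emit artificial events drawn from the alternate persona's traits."""
    return [
        {"topic": rng.choice(sorted(persona["interests"])),
         "time_of_day": rng.choice(sorted(persona["active_hours"]))}
        for _ in range(n)
    ]

# Example run: one real browsing event yields a profile, then a disjoint persona,
# whose artificial events obscure the real interest ("finance").
events = [{"topic": "finance", "time_of_day": "morning"}]
profile = collect_user_data(events)
persona = generate_alternate_persona(profile)
artificial = perform_online_activities(persona)
```

The key design point, per the abstract, is that the persona's second set of traits differs from the first set, so the artificial events never reveal the real traits.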

Potential Applications

The technology could be applied in online privacy protection tools, social media platforms, and online advertising.

Problems Solved

This technology addresses the difficulty of keeping user data private and secure during online activity, where real behavior can otherwise be tracked and profiled.

Benefits

The method provides users with a way to maintain their privacy and security while engaging in online activities.

Potential Commercial Applications

The technology could be utilized in privacy-focused software, data protection services, and online marketing tools.

Possible Prior Art

Possible prior art includes existing anonymizing tools or services used to protect user privacy online.

Unanswered Questions

1. How does the method ensure that the artificial user data accurately reflects the real user's online behavior?

2. What measures are in place to prevent the alternate persona from being linked back to the real user's identity?


Original Abstract Submitted

A method of protecting user privacy online includes collecting, via a computer-based user data agent, user data generated by a user on a user device and identifying a first set of traits associated with the user data. The method further includes generating a user profile based on the first set of traits associated with the user data and generating an alternate persona defined by a second set of traits that is different from the first set of traits. The method further includes automatically performing online activities based on the alternate persona to produce artificial user data that obscures real user data.