Many types of mobile apps, including those related to productivity, education, lifestyle, social media, entertainment, and gaming, exist to provide users with ease of use, convenience, and functionality. These apps collect vast amounts of data for marketing purposes and for sharing with third parties. For example, location data, IP addresses, saved home or work addresses, and saved activity on websites and services can all be collected from a user's device. Many apps allow advertising firms to track a user's location, sell that information to others, and target advertisements based on a user's location history. Users may have the option to adjust their security and privacy settings to reduce what information is collected and how it is shared. However, even with those settings adjusted, it can be difficult to determine what an app actually does with the data and location information it can access. The data may not be as private or anonymized as expected and, therefore, may be vulnerable to malicious actors.

Image Source: anupamdas.org

The popular fitness app Strava tracks a user's heart rate, activity details, GPS location, and more. Strava's heatmap feature anonymously aggregates user activity to help users find trails or exercise hotspots, meet like-minded people, and conduct their workouts in more crowded and safer locations. However, as shown in the example above, researchers discovered privacy concerns with the heatmap feature: by combining heatmap data with specific user metadata, they were able to de-anonymize users and identify their home addresses. They collected data through the Strava heatmap and used OpenStreetMap overlays and image analysis to detect activity start/stop points next to streets, indicating that a specific home is associated with a user's tracked activity. They also used Strava's search feature to identify users who registered a specific city as their location, correlating high-activity points on the heatmap with a user's home address. The researchers noted that Strava users typically register with real names and profile pictures, allowing identities to be correlated with home addresses and, if available online, voter registration data. Also, Strava accounts marked as “private” still appear when searching for a list of all users in a specified municipality. Strava's mitigations include starting activity tracking only after the user has left home and excluding a set distance around home locations from the heatmap, which would give users the option to set privacy zones around their homes and/or to opt out of the heatmap feature entirely.

Image Source: arxiv.org

The Short Message Service Center (SMSC) of a mobile network handles SMS delivery reports, providing notifications when a message has been delivered, accepted, or rejected, or has failed, expired, or become undeliverable. Although this process involves delays, the fixed nature and specific physical characteristics of mobile networks make those delays predictable when standard signal pathways are followed. In the example above, researchers developed a machine learning algorithm that analyzes the timing of these SMS delivery reports to identify and extract the recipient's location. First, measurement data was collected to correlate SMS delivery reports with the targets' known locations. Every hour for three days, multiple SMS messages were sent to the target in the experiment, either as marketing messages to be ignored or as silent SMS messages that display no content and produce no notification on the target's screen.
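For illustration only, the sketch below shows, in very simplified form, how delivery-report timings gathered during such a measurement phase might be organized against a target's known locations. The records, location labels, and timing values are hypothetical placeholders rather than data from the study, and the actual aggregation and model are described below.

```python
from statistics import mean, stdev

# Hypothetical probe records: when each SMS was sent, when its delivery report
# arrived (seconds since the start of the experiment), and the target's known
# location at that time (known only during the data-collection phase).
measurements = [
    {"sent": 0.0,     "report": 2.41,     "location": "home"},
    {"sent": 3600.0,  "report": 3602.39,  "location": "home"},
    {"sent": 7200.0,  "report": 7203.95,  "location": "office"},
    {"sent": 10800.0, "report": 10803.87, "location": "office"},
]

def timings_by_location(records):
    """Group delivery-report round-trip times by the target's known location."""
    grouped = {}
    for r in records:
        rtt = r["report"] - r["sent"]  # time from sending the SMS to its delivery report
        grouped.setdefault(r["location"], []).append(rtt)
    return grouped

for location, rtts in timings_by_location(measurements).items():
    spread = stdev(rtts) if len(rtts) > 1 else 0.0
    print(f"{location}: mean round-trip {mean(rtts):.2f}s, spread {spread:.2f}s")
```

In practice, per-location timing signatures like these would feed the training dataset described next.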
The timing of the SMS delivery reports was then measured and aggregated with matching location signatures to create a machine learning (ML) evaluation dataset. The ML model and its training data incorporated the receiving location, connectivity conditions, network type, distances between sender and receiver, and more. Although the technique has limitations and requires extensive effort, detecting information such as a physical address, job location, phone number, email address, or other personal information is still feasible, and users can easily become targets of cyberattacks, harassment, identity theft, and violence.

Additionally, threat actors may engage in doxing, a tactic that involves the malicious targeting, compiling, and public release of personally identifiable information (PII) without permission. This information is posted on hosting websites and further disseminated on social media platforms. Doxing can also refer to revealing the real person behind an anonymous username, exposing their identity online.

It is important to know how to stay secure and limit privacy concerns while using apps. Even if a user is careful about what information is voluntarily shared and which settings are adjusted, an app may still be able to track activity without the user's knowledge, and their data could remain at risk through other covert means. Beyond the personal risk to individuals, businesses and organizations are advised to weigh the risks that apps introduce and consider restricting their use in sensitive environments. It is vital to stay informed about an app's capabilities, accesses, and permissions, what data it collects, and what it does with that data.
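As one practical starting point, and not an exhaustive audit, the sketch below shows how an Android app's requested and granted permissions could be reviewed with the adb developer tool. The package name is a placeholder, and the approach assumes adb is installed, USB debugging is enabled, and a device is connected.

```python
import subprocess

# Placeholder package name; replace with the app you want to inspect.
PACKAGE = "com.example.fitnessapp"

def dump_package_info(package: str) -> str:
    """Return the raw `dumpsys package` output for the given app, which
    includes its requested, install-time, and runtime permissions."""
    result = subprocess.run(
        ["adb", "shell", "dumpsys", "package", package],
        capture_output=True, text=True, check=True,
    )
    return result.stdout

if __name__ == "__main__":
    output = dump_package_info(PACKAGE)
    # Keep only the lines that mention permissions so the report stays readable.
    for line in output.splitlines():
        if "permission" in line.lower():
            print(line.strip())
```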