WebProNews

Tag: Privacy

  • EFF Partners With DuckDuckGo, Adopts Its HTTPS Dataset

    The Electronic Frontier Foundation (EFF) is partnering with DuckDuckGo to include the latter’s HTTPS dataset in its HTTPS Everywhere browser extension.

    The EFF and DuckDuckGo are closely aligned in their commitment to protecting user privacy. DuckDuckGo’s privacy browser extension for the desktop, and its standalone privacy browser for iOS, rely on the company’s Smarter Encryption technology.

    Smarter Encryption upgrades a standard unencrypted (HTTP) website connection to an encrypted (HTTPS) connection where possible. Smarter Encryption is more advanced than many competing options, since DuckDuckGo crawls and re-crawls the web to keep its dataset current.
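
    Conceptually, the upgrade is a lookup-and-rewrite: if the requested host appears in a crawled dataset of HTTPS-capable sites, the connection is rewritten from http:// to https://. A minimal sketch of that idea (the dataset contents below are placeholders, not DuckDuckGo's actual list):

```python
from urllib.parse import urlsplit, urlunsplit

# Placeholder stand-in for a crawled list of sites known to support HTTPS.
HTTPS_CAPABLE = {"example.com", "en.wikipedia.org"}

def upgrade_to_https(url: str) -> str:
    """Rewrite an http:// URL to https:// if the host is known to support it."""
    parts = urlsplit(url)
    if parts.scheme == "http" and parts.hostname in HTTPS_CAPABLE:
        # Replace only the scheme; host, path, query, and fragment are kept.
        return urlunsplit(("https",) + parts[1:])
    return url
```

    Hosts not in the dataset are left untouched, since forcing HTTPS on a site that doesn't support it would simply break the connection.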

    The EFF is now adopting DuckDuckGo’s Smarter Encryption dataset for use in its own HTTPS Everywhere browser extension. Like Smarter Encryption, HTTPS Everywhere is designed to help upgrade insecure connections. The EFF’s solution previously used “a crowd-sourced list of encrypted HTTPS versions of websites,” a less efficient and less comprehensive approach than DuckDuckGo’s.

    “DuckDuckGo Smarter Encryption has a list of millions of HTTPS-encrypted websites, generated by continually crawling the web instead of through crowdsourcing, which will give HTTPS Everywhere users more coverage for secure browsing,” said Alexis Hancock, EFF Director of Engineering and manager of HTTPS Everywhere and Certbot web encrypting projects. “We’re thrilled to be partnering with DuckDuckGo as we see HTTPS become the default protocol on the net and contemplate HTTPS Everywhere’s future.”

    “EFF’s pioneering work with the HTTPS Everywhere extension took privacy protection in a new and needed direction, seamlessly upgrading people to secure website connections,” said Gabriel Weinberg, DuckDuckGo founder and CEO. “We’re delighted that EFF has now entrusted DuckDuckGo to power HTTPS Everywhere going forward, using our next generation Smarter Encryption dataset.”

  • Irish Commission Investigating Facebook Over Data Leak

    The Irish Data Protection Commission (DPC) is investigating Facebook over the 533 million user records that appeared online.

    Earlier this month, user data for 533 million Facebook users was published online. The data included full names, Facebook IDs, birthdates, locations, past locations, bios and more. Facebook said the data was scraped due to a vulnerability that was fixed in 2019 — an explanation that provides little consolation to the users whose data was leaked.

    The DPC is now investigating Facebook, and believes the incident could violate the EU’s GDPR legislation.

    The Data Protection Commission (DPC) today launched an own-volition inquiry pursuant to section 110 of the Data Protection Act 2018 in relation to multiple international media reports, which highlighted that a collated dataset of Facebook user personal data had been made available on the internet. This dataset was reported to contain personal data relating to approximately 533 million Facebook users worldwide. The DPC engaged with Facebook Ireland in relation to this reported issue, raising queries in relation to GDPR compliance to which Facebook Ireland furnished a number of responses.

    The DPC, having considered the information provided by Facebook Ireland regarding this matter to date, is of the opinion that one or more provisions of the GDPR and/or the Data Protection Act 2018 may have been, and/or are being, infringed in relation to Facebook Users’ personal data.

    The DPC’s investigation is just the latest issue Facebook is facing, amid antitrust allegations and ongoing privacy concerns.

  • EU Set to Ban AI-Based Mass Surveillance

    The European Union is preparing to pass rules that would ban AI-based mass surveillance, in the strongest repudiation of surveillance yet.

    According to Bloomberg, the EU is preparing to pass rules that would ban using AI for mass surveillance, as well as ranking social behavior. Companies that fail to abide by the new rules could face fines up to 4% of their global revenue.

    The rules are expected to tackle a number of major and controversial areas where privacy is concerned. AI systems that manipulate human behavior, or exploit information about individuals and groups, would be banned. The only exceptions would be some public security applications.

    Similarly, remote biometric ID systems in public places would require special authorization. Any AI applications considered ‘high-risk’ — such as ones that could discriminate or endanger people’s safety — would require inspections to ensure the training data sets are unbiased, and that the systems operate with the proper oversight.

    Most importantly, the rules will apply equally to companies based within the EU or abroad.

    The new rules could still change in the process of being passed into law but, as it stands now, the EU is clearly establishing itself as a protector of privacy where AI-based mass surveillance is concerned.

  • German Regulator Wants to Stop WhatsApp/Facebook Data Sharing

    A German regulator is the latest to object to data being shared between WhatsApp and Facebook, and is taking steps to stop it.

    WhatsApp and Facebook drew worldwide ire when it was announced that WhatsApp user data would be shared with other Facebook-owned companies. The backlash was immediate, with Facebook initially delaying the move to give people time to adjust. Ultimately, however, the company is moving forward with its plans, and users will either have to accept the change or lose access to WhatsApp.

    A German regulator wants a third option, according to Bloomberg, with the regulator for Hamburg seeking an order that would block the data sharing. Given the data sharing is set to go into effect May 15, the regulator is seeking an order that would be “immediately enforceable.”

    “WhatsApp is now used by almost 60 million people in Germany and is by far the most widely used social media application, even ahead of Facebook,” Johannes Caspar, the data commissioner, said in a statement. “It is therefore all the more important to ensure that the high number of users, which makes the service attractive to many people, does not lead to an abusive exploitation of data power.”

    India has similarly taken steps to block data sharing between the services. With Germany now taking action as well, more jurisdictions may start taking a closer look and enacting privacy protection measures before it’s too late.

  • Signal Adding Privacy-Focused Cryptocurrency Payments

    The Signal messaging app is adding payments, using the MobileCoin cryptocurrency and wallet.

    Signal is widely considered to be the most private messaging platform available. It’s used by the US Senate, the EU Commission and various US military units. The platform provides end-to-end encryption, and has seen a major boost in popularity as a result of Facebook’s privacy blunder with WhatsApp.

    Signal is now looking to add payment processing, in a bid to better compete with WhatsApp, Apple iMessage and others. In keeping with its privacy roots, the company is integrating a privacy-focused cryptocurrency and wallet.

    Signal Payments makes it easy to link a MobileCoin wallet to Signal so you can start sending funds to friends and family, receive funds from them, keep track of your balance, and review your transaction history with a simple interface. As always, our goal is to keep your data in your hands rather than ours; MobileCoin’s design means Signal does not have access to your balance, full transaction history, or funds. You can also transfer your funds at any time if you want to switch to another app or service.

    The feature is currently in beta, and Signal actively wants feedback from users.

  • Apple Rejecting Apps that Use Fingerprinting SDKs

    Apple has begun rejecting apps that use software development kits (SDKs) that engage in fingerprinting.

    Fingerprinting is a method of collecting data and tracking users, creating a unique device fingerprint that can be tracked across services. As part of iOS 14’s improved privacy, Apple is now rejecting app submissions that use SDKs known to engage in this behavior.

    According to AppleInsider, a number of developers have already been notified of rejections.

    “Our app just got rejected by Apple’s app reviewer, blaming the MMP SDK for building a fingerprint ID,” wrote Heetch’s Aude Boscher, in an industry Slack channel. “I saw other people complaining … so it might soon come up for you as well!”

    Apple’s notification message clearly says what the problem is:

    Your app uses algorithmically converted device and usage data to create a unique identifier in order to track the user. The device information collected by your app may include some of the following: NSLocaleAlternateQuotationBeginDelimiterKey, NSTimeZone, NSLocaleGroupingSeparator, NSLocaleDecimalSeparator …
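
    Apple’s notice describes the mechanism: semi-stable device and locale values are combined and hashed into a single identifier that survives app reinstalls. The offending SDKs’ actual algorithms aren’t public; the following is only an illustration of the general technique, using hypothetical values for the kinds of keys the notice lists:

```python
import hashlib
import json

def device_fingerprint(properties: dict) -> str:
    """Hash a bundle of semi-stable device settings into one identifier.

    Illustrative sketch of the general fingerprinting technique only;
    this is not any vendor's actual algorithm.
    """
    # Canonicalize with sorted keys so identical settings always
    # produce the same fingerprint, regardless of insertion order.
    canonical = json.dumps(properties, sort_keys=True)
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

# Hypothetical values for keys named in Apple's rejection notice.
settings = {
    "NSTimeZone": "Europe/Paris",
    "NSLocaleGroupingSeparator": " ",
    "NSLocaleDecimalSeparator": ",",
}
fingerprint = device_fingerprint(settings)
```

    Because the inputs rarely change, the same device keeps producing the same hash, which is exactly what makes this usable as a tracking identifier.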

    Adjust makes one of the offending SDKs, used by some 50,000 apps. However, the company has released an update that removes the offending code, providing a path forward for the apps using it.

    While the change is no doubt inconvenient for developers, kudos to Apple for cracking down on one of the more insidious methods of tracking users.

  • Hackers Access 150,000 Security Cameras: Tesla, Hospitals and Prisons Exposed

    A group of hackers has gained access to roughly 150,000 Verkada security cameras, exposing a slew of customer live feeds.

    Verkada is a Silicon Valley startup that specializes in security systems. The company’s cameras are used by a wide range of companies and organizations, including Tesla, police departments, hospitals, clinics, schools and prisons.

    The group responsible is an international collective of hackers who claim to have breached Verkada to shed light on how pervasive surveillance has become.

    In one of the videos, seen by Bloomberg, eight hospital staffers are seen tackling a man and restraining him. Other video feeds include women’s clinics, as well as psychiatric hospitals. What’s more, some of the feeds — including those of some hospitals — use facial recognition to identify and categorize people.

    The feeds from the Madison County Jail in Huntsville, Alabama were particularly telling. Of the 330 cameras in the jail, some were “hidden inside vents, thermostats and defibrillators.”

    The entire case is disturbing on multiple fronts. It’s deeply concerning that a company specializing in security, and selling that security to other organizations, would suffer such a devastating breach. It’s equally concerning, however, to see the depth of surveillance being conducted, as well as the lengths being taken to hide the surveillance.

  • Android Phones Home 20x More Than iOS

    A computer science researcher at Trinity College Dublin has released a report showing that Android devices send Google 20x more data than iOS devices send Apple.

    Apple and Google have fundamentally different approaches to data. Apple is a hardware and, increasingly, a software and services company. Unlike Google, however, Apple charges for the majority of its products and services. As a result, the company has repeatedly said it has no interest in consumer data, or viewing that data as the product.

    In contrast, Google offers most of its services free of charge. Its profits come primarily from data, making the customer — and their data — Google’s real product.

    According to Ars Technica, researcher Doug Leith shows how differently the two companies’ phones transmit data, mirroring their respective approaches to consumer data.

    Where Android stands out, Leith said, is in the amount of data it collects. At startup, an Android device sends Google about 1MB of data, compared with iOS sending Apple around 42KB. When idle, Android sends roughly 1MB of data to Google every 12 hours, compared with iOS sending Apple about 52KB over the same period. In the US alone, Android collectively gathers about 1.3TB of data every 12 hours. During the same period, iOS collects about 5.8GB.

    Needless to say, Google has disputed the findings, with a spokesperson providing the following statement to Ars:

    We identified flaws in the researcher’s methodology for measuring data volume and disagree with the paper’s claims that an Android device shares 20 times more data than an iPhone. According to our research, these findings are off by an order of magnitude, and we shared our methodology concerns with the researcher before publication.

    This research largely outlines how smartphones work. Modern cars regularly send basic data about vehicle components, their safety status and service schedules to car manufacturers, and mobile phones work in very similar ways. This report details those communications, which help ensure that iOS or Android software is up to date, services are working as intended, and that the phone is secure and running efficiently.

    Despite Google’s protestations, Leith’s research is no surprise to anyone who has followed Google’s data-mining and collection practices.

  • Consumer Reports: Tesla’s In-Vehicle Cameras a Privacy Concern

    Consumer Reports has raised concerns about Tesla’s in-vehicle cameras, saying they represent a privacy concern.

    Vehicles are increasingly moving toward automation, and a big part of that is cameras that monitor the driver. In many cases, these are to measure the driver’s response and ensure they are paying attention to the road.

    While several automakers include monitoring cameras, Tesla’s approach is very different from its competitors’. According to Consumer Reports, BMW, Ford, GM and Subaru’s cameras are all closed-circuit systems. The cameras are used exclusively in-vehicle, and do not record or transmit their footage.

    In contrast, Tesla has admitted that its cameras both record and transmit video to the company, which it then studies and analyzes to improve its self-driving technology.

    If drivers enable the cabin camera, Tesla says it will capture and share a video clip of the moments before a crash or automatic emergency braking (AEB) activation to help the automaker “develop future safety features and software enhancements,” according to Tesla’s website. Tesla did not respond to CR’s emailed request for additional information about its in-car monitoring systems.

    Tesla’s actions raise concerns about who benefits most from its monitoring systems, especially since the company has a habit of quickly blaming the driver when an accident occurs while the vehicle’s Autopilot is engaged.

    “We have already seen Tesla blaming the driver for not paying attention immediately after news reports of a crash while a driver is using Autopilot,” said Kelly Funkhouser, CR’s program manager for vehicle interface testing. “Now, Tesla can use video footage to prove that a driver is distracted rather than addressing the reasons why the driver wasn’t paying attention in the first place.”

    There’s also concern that Tesla’s system could be used in the future for some yet-to-be-disclosed purpose.

    Ultimately, the questions about Tesla’s in-vehicle monitoring system make a case for greater consumer protection — and buying a competitor’s offering.

    “Advanced features in cars can bring consumers enormous benefits, but it’s important for our laws to make sure that automakers put people ahead of their bottom line. Automotive innovation must come hand-in-hand with strong and sensible consumer protections,” says William Wallace, manager of safety policy at CR.

  • Mozilla Improves Privacy by Trimming HTTP Referrer

    Mozilla has announced a significant change to how Firefox handles HTTP Referrers, in an effort to improve user privacy.

    The HTTP Referrer is a header the browser sends to the current website, telling it which website “referred” the visit. In other words, the current website learns the last page the browser came from.

    In many cases, the referrer information is used in harmless ways, but it can be abused to gain access to private information. Because the referrer information includes the specific page a person was previously looking at, it can help a website better understand a visitor’s interests. It can also include a user’s account information from the website they came from.

    Mozilla is now trimming the referrer information in an effort to better protect user privacy.

    Starting with Firefox 87, we set the default Referrer Policy to ‘strict-origin-when-cross-origin’ which will trim user sensitive information accessible in the URL. As illustrated in the example above, this new stricter referrer policy will not only trim information for requests going from HTTPS to HTTP, but will also trim path and query information for all cross-origin requests. With that update Firefox will apply the new default Referrer Policy to all navigational requests, redirected requests, and subresource (image, style, script) requests, thereby providing a significantly more private browsing experience.
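
    The policy’s behavior breaks down into three cases: same-origin requests keep the full referrer, cross-origin requests are trimmed to the origin, and HTTPS-to-HTTP downgrades send nothing at all. A simplified sketch of that decision logic (real browsers also strip fragments and credentials, which this omits):

```python
from urllib.parse import urlsplit

def referrer_for(current_url: str, destination_url: str) -> str:
    """Compute the Referer header under 'strict-origin-when-cross-origin'."""
    cur, dest = urlsplit(current_url), urlsplit(destination_url)
    if (cur.scheme, cur.netloc) == (dest.scheme, dest.netloc):
        return current_url            # same origin: full URL, path and query
    if cur.scheme == "https" and dest.scheme == "http":
        return ""                     # downgrade: send no referrer at all
    return f"{cur.scheme}://{cur.netloc}/"  # cross-origin: origin only
```

    The middle case is what Firefox already protected; the new default extends the trimming of path and query to every cross-origin request.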

    Mozilla’s announcement is a welcome one, as the company continues to be a leading advocate for user privacy.

  • France Upholds Apple’s Privacy Changes

    In a win for Apple and privacy advocates, the French Competition Authority has upheld Apple’s right to proceed with its iOS privacy changes.

    Apple has been warning developers since last year of upcoming privacy changes to iOS that would prevent apps from tracking users without their permission. Apps are also required to include a privacy label that outlines exactly what data they collect.

    Needless to say, the advertising industry has been up in arms over the changes, clinging to the archaic belief it should have the right to collect detailed, personal data and track users across services and devices, without their knowledge or consent. As a result, the advertising industry is fighting on multiple fronts to force Apple to back down.

    According to Fortune, the French Competition Authority said Apple’s plans did not appear to be abusive, since “a company, even if it is in a dominant position…has the freedom in principle to set rules to access its services, subject to not disregarding the laws and applicable regulations and that these rules are not anticompetitive.”

    The Competition Authority said it would continue to investigate to make sure Apple is playing by its own rules, and not gathering and tracking more data than it allows third-party developers to track.

    “We’re grateful to the French Competition Authority for recognizing that App Tracking Transparency in iOS 14 is in the best interest of French iOS users,” Apple said in a statement.

  • TikTok Draws Scrutiny and Warning From EU

    TikTok is once again under scrutiny for its data practices, with the EU warning that some data may be making its way to China.

    TikTok claims that EU user data is sent to the US, not China. But according to the EU, some of that data may be accessible to engineers based in China, reports Bloomberg.

    “TikTok tells us that EU data is transferred to the U.S. and not to China, however we have understood that there is possibility that maintenance and AI engineers in China may be accessing data,” said Helen Dixon, the Irish Data Protection Commissioner.

    The claim is the latest in a long string of privacy issues the social media company has faced. The most recent saw the company settle a lawsuit for some $92 million. TikTok’s privacy practices also led the Trump administration to try to ban the app, although it’s unclear if the Biden administration will continue pursuing those efforts.

  • T-Mobile’s Privacy-Threatening Ads Are Decidedly ‘Carrier’

    T-Mobile prides itself on being the “Un-carrier,” but its latest advertising move is decidedly “Carrier” and threatens its users’ privacy.

    T-Mobile’s turnaround has been so successful that it will be studied in business schools for years to come. Once the fourth-largest carrier, and facing major challenges, the company surpassed Sprint to take third place and then bought it out, moving into second. T-Mobile now finds itself a leader in 5G and the company to beat in the wireless industry.

    Much of that success stems from its Un-carrier status, with an emphasis on giving customers what they want. Unlimited data, taxes and fees included in the final price, international texting and data, as well as free calling to and from Canada and Mexico are just a few of the features the magenta carrier pioneered or reintroduced to the market.

    The company’s customer-focused approach makes its latest decision all the more difficult to understand, as it is automatically opting customers into targeted advertising that will use their data.

    Under T-Mobile’s personalized ads program, we use and analyze data from things like device and network diagnostic information (Android users only), apps on your device, and broadband information. This data helps us understand more about user interests (e.g., sports enthusiast, loves cooking, etc.). Using this information, we create groups known as “audience segments,” which may be used by T-Mobile or sold to third parties to make ads more relevant to you. When we sell audience segments, we do not sell information that directly identifies customers, like name, address, or email. Rather, audience segments are associated with mobile advertising IDs, which are long sets of numbers and letters. For example, this might say something like “2drdn43np2cMapen084 is a sports enthusiast.” Take a look at our Advertising and Analytics article and T-Mobile privacy policy for details.

    A spokeswoman told The Wall Street Journal that the company had “heard many say they prefer more relevant ads so we’re defaulting to this setting.”

    The company claims that the information is not identifiable and can’t be linked to a specific user. Unfortunately, that claim doesn’t even begin to hold water.

    “It’s hard to say with a straight face, ‘We’re not going to share your name with it,’ ” Aaron Mackey, a lawyer for the San Francisco-based Electronic Frontier Foundation, told the WSJ. “This type of data is very personal and revealing, and it’s trivial to link that deidentified info back to you.”

    While Verizon and AT&T both sell customer data to advertisers, they take the extra step of pooling the data together, making it much more difficult, if not impossible, to identify specific profiles. Both companies also have more detailed targeted ad programs, like T-Mobile’s, that share far more personal data. However, those programs are opt-in, not on by default like T-Mobile’s.

    Fortunately, it’s relatively easy to opt out of T-Mobile’s targeted ads. Simply go to T-Mobile.com, click on Account > Profiles > Privacy and Notifications > Advertising & Analytics and toggle “Use my data to make ads more relevant to me” to “Off.”

    While it may be easy to turn the feature off, that doesn’t change the fact it should never have been an opt-out proposition. It’s one thing for free services, such as Facebook and Google, to make money off of targeted ads that use personal data and infringe on privacy, but it’s quite another for a paid service to presume to do the same. For a company that prides itself on protecting the consumer to do so…well, that’s just unconscionable.

    T-Mobile’s actions in this instance are more “Carrier” than the two wireless carriers it constantly mocks.

  • Google Wants a More Private Web, Will Not Build ‘Alternate Identifiers’ to Replace Cookies

    Google has announced it has no intention to build or use “alternate identifiers” as a replacement to cookies for tracking individuals.

    Google stunned the industry when it announced it would remove support for third-party cookies in Chrome, which currently has roughly 70% of the web browser market. While useful for providing site functionality, cookies are often used to track individuals across websites and build a startlingly complete picture of a person’s interests and browsing habits.

    Some had thought Google might develop alternative identifier solutions to replace cookies, but the company has firmly shot that idea down. David Temkin, Director of Product Management, Ads Privacy and Trust, outlined the company’s plans in a blog post:

    That’s why last year Chrome announced its intent to remove support for third-party cookies, and why we’ve been working with the broader industry on the Privacy Sandbox to build innovations that protect anonymity while still delivering results for advertisers and publishers. Even so, we continue to get questions about whether Google will join others in the ad tech industry who plan to replace third-party cookies with alternative user-level identifiers. Today, we’re making explicit that once third-party cookies are phased out, we will not build alternate identifiers to track individuals as they browse across the web, nor will we use them in our products.

    Temkin reiterated the company’s commitment to its Federated Learning of Cohorts (FLoC) API. FLoC is designed to hide an individual in the crowd, essentially providing privacy through obscurity. Some are not convinced, however, with the EFF labeling FLoC “a terrible idea.”

    Still, given Google’s history of ignoring and abusing individuals’ privacy, a history that has resulted in lawsuits, it’s refreshing to see the company take at least some stand for privacy.

    Keeping the internet open and accessible for everyone requires all of us to do more to protect privacy — and that means an end to not only third-party cookies, but also any technology used for tracking individual people as they browse the web. We remain committed to preserving a vibrant and open ecosystem where people can access a broad range of ad-supported content with confidence that their privacy and choices are respected. We look forward to working with others in the industry on the path forward.

  • Brave Launching Privacy-Focused Brave Search

    Brave, the privacy-focused web browser made by JavaScript creator Brendan Eich, is throwing its hat in the search engine ring.

    Brave has made a name for itself as one of the best web browsers for an out-of-the-box privacy focus, aggressively blocking trackers and ads. The browser uses Chromium as its rendering engine, ensuring its high performance and compatibility. Brave also includes its own cryptocurrency, which can be used as a way of rewarding content makers, in an effort to reinvent how paid web content works.

    The company’s latest effort is its most ambitious yet, with plans to take on Google with a more privacy-focused alternative — Brave Search.

    Billed as “search without a trace,” Brave Search will respect privacy, not harvesting user data, tracking or profiling users, or being beholden to advertisers. The search engine will offer both ad-free paid search and ad-supported free search options.

    The most critical basis of a search engine is its index of the web. To make Brave Search a reality, the company acquired Tailcat, an open search engine developed by the same team responsible for German search engine Cliqz, a Hubert Burda Media holding. Tailcat will form the basis of the new Brave Search.

    “Brave has grown significantly over the past year, from 11 million monthly active users to over 25 million. We expect to see even greater demand for Brave in 2021 as more and more users demand real privacy solutions to escape Big Tech’s invasive practices,” said Brendan Eich, CEO and co-founder of Brave Software. “Brave’s mission is to put the user first, and integrating privacy-preserving search into our platform is a necessary step to ensure that user privacy is not plundered to fuel the surveillance economy.”

    “We are very happy that our technology is being used at Brave and that, as a result, a genuine, privacy-friendly alternative to Google is being created in the core web functions of browsing and searching,” added Paul-Bernhard Kallen, CEO of Hubert Burda Media. “As a Brave stakeholder we will continue to be involved in this exciting project.”

    “The only way to counter Big Tech with its bad habit of collecting personal data is to develop a robust, independent, and privacy-preserving search engine that delivers the quality users have come to expect. People should not be forced to choose between privacy and quality,” said Dr. Josep M. Pujol, head of the Tailcat project. “The team is excited to be working on the only real private search/browser alternative to Big Tech available on the market.”

    With Google Chrome and Google Search boasting a 70% and 92% share of their respective markets, Brave definitely has an uphill battle ahead of it. Nonetheless, the company has gained significant momentum over the last couple of years. In addition, Google’s antitrust troubles have opened the door to what may be the best opportunity to challenge the once unassailable market leader.

    In the meantime, interested users can sign up to be put on a waiting list for early access to Brave Search.

  • Judge ‘Disturbed’ by Google’s Data Tracking

    U.S. District Judge Lucy Koh has expressed she is “disturbed” by accusations regarding the depth of Google’s data tracking habits.

    Google is facing a class-action lawsuit accusing the company of lying to its customers when it says it doesn’t track them in Chrome’s Incognito Mode. When Incognito Mode is active, the browser is not supposed to remember browsing history, form data, cookies, site data and more.

    The lawsuit alleges that Google is leveraging code in its analytics platform — which is used on countless websites — to bypass Incognito Mode. This allegedly gives Google the ability to scrape data to piece together a profile of users’ browsing and habits.

    According to Bloomberg, Judge Koh was “disturbed” by the accusations. When Google tried to have the case dismissed, Judge Koh said it was “unusual” that Google would go to the “extra effort” to collect the data in question, unless it was using it to do the very thing Incognito is supposed to prevent.

    Google is facing multiple lawsuits, both for its privacy practices and for alleged anticompetitive behavior. A judge finding the company’s actions ‘disturbing’ is not a good look for Google.

  • TikTok Settles Privacy Suit For $92 Million

    TikTok has agreed to pay $92 million to settle a lawsuit in the US over its privacy practices.

    TikTok quickly rose to be one of the most popular social media platforms in the world, and was the first Chinese app to gain its level of worldwide success. With that success, however, came a greater degree of scrutiny. The app has repeatedly been accused of violating child privacy, uploading videos to China without user consent and being a threat to national security.

    The company has settled a lawsuit filed by TikTok users in the U.S. District Court in Illinois. Illinois has already established itself as a leader in privacy enforcement, with Facebook recently settling a similar lawsuit filed in the state.

    “While we disagree with the assertions, rather than go through lengthy litigation, we’d like to focus our efforts on building a safe and joyful experience for the TikTok community,” TikTok said Thursday, according to NBC News.

    The settlement requires court approval.

  • Firefox Introduces Total Cookie Protection

    The latest release of Mozilla’s Firefox includes a significant privacy upgrade, introducing Total Cookie Protection.

    Cookies are small pieces of data that websites store in the browser to identify users. They enable a number of useful features, such as the ability to revisit a site and access personalized information without needing to log in again. Cookies can also be used to track users, however, including by third-party companies that follow users across multiple websites.

    In 2019, Firefox introduced Enhanced Tracking Protection (ETP), which blocks cookies from known trackers. Today’s announcement takes it a step further with Total Cookie Protection.

    “Our new feature, Total Cookie Protection, works by maintaining a separate ‘cookie jar’ for each website you visit,” Mozilla explains. “Any time a website, or third-party content embedded in a website, deposits a cookie in your browser, that cookie is confined to the cookie jar assigned to that website, such that it is not allowed to be shared with any other website.”

    This is an important feature that will go a long way toward protecting user privacy and ensuring cookies aren’t abused as a way of tracking users.
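    The partitioning Mozilla describes can be illustrated with a toy model. This is only a sketch of the idea, not Firefox’s implementation; the site names and helper functions below are hypothetical:

    ```python
    # Toy model of per-site cookie partitioning ("cookie jars").
    # Cookie storage is keyed by the top-level site the user is visiting,
    # so a tracker embedded on two different sites gets two independent
    # jars and cannot link the visits together.
    from collections import defaultdict

    cookie_jars = defaultdict(dict)  # top-level site -> {(setter, name): value}

    def set_cookie(top_level_site, setter_domain, name, value):
        # Even a third-party (setter_domain) cookie lands in the jar of
        # the site the user is actually visiting.
        cookie_jars[top_level_site][(setter_domain, name)] = value

    def get_cookie(top_level_site, setter_domain, name):
        # Lookups never cross partitions, so a tracker on one site
        # cannot read the cookie it set while embedded on another.
        return cookie_jars[top_level_site].get((setter_domain, name))

    set_cookie("news.example", "tracker.example", "uid", "abc123")
    # Same tracker, different top-level site: the jar is separate.
    assert get_cookie("shop.example", "tracker.example", "uid") is None
    assert get_cookie("news.example", "tracker.example", "uid") == "abc123"
    ```

    The key design point is that the partition key is the first-party site, not the domain that set the cookie, which is what prevents cross-site linkage.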

  • TikTok Runs Afoul of European Consumer Law

    TikTok has found itself in trouble with the European Consumer Organisation (BEUC) over multiple alleged infractions of EU consumer law.

    TikTok has faced repeated criticism for privacy breaches, especially regarding the privacy of minors, along with ongoing scrutiny and lawsuits over its privacy practices in general.

    The latest troubles come from the EU, where the BEUC accuses TikTok of violating several consumer laws, including failing to properly protect children.

    The BEUC’s investigation found that a number of TikTok’s Terms of Service were unfair and ambiguous, favoring TikTok to users’ detriment. The same is true of the company’s copyright terms, which give TikTok “an irrevocable right to use, distribute and reproduce the videos published by users, without remuneration.”

    The BEUC also took issue with how TikTok administers the coins people can purchase to reward their favorite content creators, arguing the company retains too much control over exchange rates.

    The company’s handling of user data is also misleading, with TikTok not properly informing users — especially children — of how and why their data is being collected and how it’s being used.

    Most egregiously, the company is failing to protect children and minors from potentially harmful content and hidden advertising.

    The BEUC wants “authorities to launch a comprehensive investigation into TikTok’s policies and practices and to ensure that TikTok respects EU consumer rights. The company should properly inform consumers about its business model and data processing activities and stop imposing unfair terms and practices on its users. TikTok should also stop keeping its users in the dark about the financial consequences of buying virtual gifts for their favourite idols and improve the fairness of this service. In particular children and teenagers, who form an important part of Tik Tok’s audience must be adequately protected regarding their exposure to marketing, hidden advertising and inappropriate content.”

    Given the EU’s strong privacy and consumer legislation, TikTok’s fast-and-loose approach to privacy is likely coming to an end.

  • iOS 14.5 Safari Will Mask IP Address From Google

    Apple is ramping up its efforts to protect user privacy, including a feature in iOS 14.5 that will mask IP addresses in Safari.

    Apple has been making significant changes to iOS and iPadOS, forcing app makers to include privacy labels to disclose what information they track. The company is also preparing to include a feature that will force apps to ask for permission to track users, rather than doing it automatically.

    Now the company is working on its next big privacy upgrade: masking the user’s IP address in Safari. Companies can use a device’s IP address to help build a profile of an individual’s browsing habits. Given that Safari’s default search engine is Google, this is a real concern.

    First noticed by a Reddit user and reported by The 8-Bit, the feature is enabled when Safe Browsing is active. Safe Browsing is activated by turning on the “Fraudulent Website Warning” in Settings > Safari.

    Maciej Stachowiak, Apple’s Head of WebKit Engineering, provided additional detail about how the feature will work.

    As Stachowiak explains, iOS 14.5 Safari will re-route traffic through an Apple proxy service to hide IP addresses. This should provide a significant level of additional privacy to iOS and iPadOS users.
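    The general mechanism at work here is ordinary proxying: the destination server sees the proxy’s IP address rather than the client’s. A minimal sketch of that idea, with no claim about how Apple’s internal proxy service actually operates, and with `proxy.example.com:8080` as a hypothetical placeholder:

    ```python
    # Sketch of proxy-based IP masking, assuming a hypothetical proxy at
    # proxy.example.com:8080. Apple's Safe Browsing proxying happens inside
    # its own infrastructure; this only illustrates the general technique.
    import urllib.request

    proxy = urllib.request.ProxyHandler({
        "https": "https://proxy.example.com:8080",
    })
    opener = urllib.request.build_opener(proxy)

    # Any request made through `opener` is relayed via the proxy, so the
    # destination server logs the proxy's IP address, not the user's.
    # e.g. opener.open("https://www.google.com/search?q=...")
    ```

    Because Google (or any site) only ever sees connections arriving from Apple’s proxy, it cannot use the visitor’s IP address to enrich a browsing profile.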

  • Virginia Following California’s Example With Privacy Law

    Virginia is poised to join California in enacting comprehensive privacy legislation to protect its citizens.

    Unlike the EU, the US does not have national privacy legislation. As a result, California was the first state to pass such legislation to protect its own citizens. The California Consumer Privacy Act (CCPA) went into effect on January 1, 2020. An updated California Privacy Rights Act (CPRA) was approved by voters on November 3, 2020 and goes into effect January 1, 2023. The CPRA builds on the CCPA, adding additional protections.

    Virginia is now on the verge of passing its own privacy legislation, according to Reuters. The Virginia Senate has passed a version of the bill, following the Virginia House’s passage of its own bill a week earlier. The next step is for legislators to reconcile the two bills and pass the reconciled version, which shouldn’t pose a problem since the two bills are almost identical. Once the governor signs the bill into law, it will go into effect January 1, 2023.

    Another state privacy law will further complicate compliance for companies that must abide by multiple state laws. Some companies were already applying the CCPA to all US customers, and may decide to do the same with Virginia’s law should it take effect.

    Either way, if Virginia passes its own privacy legislation, it will increase pressure on the US government to pass comprehensive federal privacy legislation.