I recently noticed that a Chrome extension I had installed (a simple utility) was loading Mixpanel into every page I visited, whether or not I was actually using the extension. Having published a popular Chrome extension in the past, and having received random solicitations from people offering to pay us to track our traffic across the web, I know this is an industry.
This is an obvious threat to privacy and security, but it is also a threat to a pleasant browsing experience, because all that injected script bogs down web pages. Both are terrible for the open web.
Take the time to be aware of the issues if you are not already paying attention.
Top comments (15)
Here's my list of plugins...
I've used Honey in the past, but I decided to remove it because of, again, the tracking. I don't shop online that often anyway. ;)
I wish browsers had a warning/monitoring service to indicate what kind of traffic plugins are generating.
Also be aware that a lot of the security restrictions on JavaScript just don't apply to plugins. They are given nearly free rein over the system (not really, but close enough).
I was using the Grammarly plugin briefly, until I realized it would send everything I typed, anywhere, to their servers (and not on an opt-in basis). Scary.
Always check the network tab. It doesn't provide a ton of insight, but enough for you to understand that something is happening.
Yes, it's remarkably lax. I really hope it's an ecosystem that Google (and others, but mostly Google, honestly) is working hard to improve.
What extension did you create by the way?
I was using a page ruler extension a few months back, and suddenly it started throwing the same Mixpanel error on every page, probably because the extension had become outdated. Then I realized what was happening and had to remove it.
As developers, we have to use only the minimum set of permissions required for our extensions to work. Then we need to educate users about the importance of the consent popup and what exactly each permission warning means. The most severe permissions are similar to willingly handing over the keys to your home. For optional features of your extension, use optional permissions that the user can accept manually later; this drastically reduces the potential attack vectors.
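To make that concrete, here is a minimal sketch of the optional-permission pattern (the element id, feature function, and host pattern are made up, and it assumes a Manifest V2 extension with the Chrome extension typings available): the manifest declares only what the core feature needs, and the broad host access is requested only at the moment the user opts in.

```ts
// Hypothetical options-page script for a Manifest V2 extension.
// manifest.json would declare only what the core feature needs, e.g.:
//   "permissions": ["storage"],
//   "optional_permissions": ["https://*/*"]

// Stand-in for whatever the optional feature actually does.
function enablePageFeature(): void {
  console.log("Optional feature enabled.");
}

// Ask for the broad host access only when the user opts in.
document.getElementById("enable-page-feature")?.addEventListener("click", () => {
  chrome.permissions.request({ origins: ["https://*/*"] }, (granted) => {
    if (granted) {
      enablePageFeature();
    } else {
      console.log("Permission declined; the core features still work without it.");
    }
  });
});
```

Because chrome.permissions.request has to be triggered by a user gesture, wiring it to the opt-in click also gives you a natural moment to explain to the user what the permission is for.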
As consumers we live in a world where big companies own our entire online presence and we accept it because it's just how it is. But we mistakenly think that just about any third party provider should be able to "Read and modify all your data on all websites you visit", which is what most extensions require on install. Think about it: do you want some random developer to have read/write access to your entire online activity?
Even if an extension was not malicious initially, it can become malicious without you ever knowing, so please don't post a list of your extensions unless they require very limited permissions.
I have a lot of Chrome extensions (managed with SimpleExtManager), mostly disabled in cycles depending on what I'm doing. The API definitely lets you collect some data about users behind a simple permission popup, but there are also limits on what extensions can do; they were strict enough that I gave up on my first extension after realizing extensions were too sandboxed to implement it.
In my everyday browser I stick to a minimal extension set: Vimium, uBlock Origin, and the Shodan plugin.
Years ago I had more, but the subsequent ad-blocker wars, malware injection scandals, and commercial surveillance have made me cut it down to just these three.
Usually I can get at what I'd want from other extensions by examining scripts or HTTP headers in the terminal instead, commonly with curl, w3m or jq.
Extensions are essentially the browser version of "Run as Administrator...". It's awful. And while the permission systems in most browsers are great, many extensions ask for sweeping access across the sites you browse. I only have uBlock Origin and Session Buddy.
Facebook used to track your data and activity this way too. (It seems to show up only on Android; on iOS everything appears to be OK.)
Go to facebook.com/settings.
At the bottom, there is a link to download everything you have created on FB since your first day on the social network.
Download the .zip archive.
Unzip the archive and you'll find an html folder. Go there and find the contact_info.htm file.
Open it and you'll see a list of all the phone contacts you talked with (by phone, not via the Facebook application), along with the timing.
These are the permissions you usually grant FB when installing the Facebook application (the checkbox next to 'access to contacts').
If you did not give FB access to your contacts and phone calls, the file will be empty. But that does not mean anything.
I have worked on the 1Password browser extension (one of the most widely used in the world, I would have to guess) for the last several years, and keeping browser extensions light enough not to place an unnecessary burden on users is one of the things we keep top of mind. We minify our code using Google Closure Compiler (for the desktop app's extension) or Uglify (for the new 1Password X), and folks assume this is because we want to obfuscate some secret sauce in the code, but really, it's about these kinds of good-citizen goals. Recently, we were testing a build in which the minifier had inadvertently been disabled for just one of our libraries, and it ballooned the packaged file by multiple megabytes. When you start injecting that amount of script into every document (in every iframe, and some sites have as many as 100 on a page), size definitely matters.
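For a sense of what that minification step looks like, here is a generic sketch of a build script using UglifyJS (this is not 1Password's actual build; the file names are made up, and it assumes Node with the uglify-js package installed):

```ts
// Generic illustration of a minification step in a build script.
import { readFileSync, writeFileSync } from "fs";
import { minify } from "uglify-js";

// Hypothetical file names for a packaged content script.
const source = readFileSync("build/content-script.js", "utf8");

// Compression and name mangling are enabled by default.
const result = minify(source);
if (result.error) {
  throw result.error;
}

writeFileSync("build/content-script.min.js", result.code);
console.log(`minified: ${source.length} -> ${result.code.length} characters`);
```

Running something like this over every packaged script is what keeps the payload injected into each document (and each iframe) small.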
I think the extension hosts could do a better job of disclosure, though. It would not be hard to see that a script listed in the content scripts (or injected programmatically) is using one of these big ad frameworks' scripts, and to alert the user or require the extension developer to indicate this clearly in their extension listing. Chrome's store presentation is particularly well suited to this, given the distance between the form the developer fills in and how that form is presented on the store.
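As a rough illustration of how cheap such a check could be (entirely hypothetical; the host list, file layout, and function name are my own examples, not anything the Chrome Web Store actually runs), a store-side scan might look something like this:

```ts
// Hypothetical check: scan an extension's content script files for
// references to well-known analytics/tracking hosts.
import { readFileSync } from "fs";

const KNOWN_TRACKER_HOSTS = ["mixpanel.com", "google-analytics.com", "segment.com"];

interface Manifest {
  content_scripts?: { js?: string[] }[];
}

function findTrackerReferences(extensionDir: string): string[] {
  const manifest: Manifest = JSON.parse(
    readFileSync(`${extensionDir}/manifest.json`, "utf8")
  );
  const hits: string[] = [];
  for (const entry of manifest.content_scripts ?? []) {
    for (const file of entry.js ?? []) {
      const code = readFileSync(`${extensionDir}/${file}`, "utf8");
      for (const host of KNOWN_TRACKER_HOSTS) {
        if (code.includes(host)) {
          hits.push(`${file} references ${host}`);
        }
      }
    }
  }
  return hits;
}

// Example usage:
// console.log(findTrackerReferences("./my-extension"));
```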
It seems fitting that you wrote this today. I had a Chrome extension error this morning and when I looked into it, this is what I saw.
I use Brave on mobile all the time, as it solves a lot of these problems, but on desktop it's still not ideal as a browser for web dev, even though it's an Electron-based browser (they forked Electron to make it more secure). My main browser on desktop is still Chrome. I have extensions like uBlock and HTTPS Everywhere. I'll have Brave running as well for work or personal email, but it's kind of hard to escape the richness of Chrome or Firefox for web dev.