Short Answer: We Don’t Think So

Privacy is a hot commodity in the current climate of technology and connectivity. It can be hard to balance the need for security with the enticing functionality of the latest apps and gadgets. When we factor in the additional need for public safety and effective law enforcement, it can feel like our privacy gets tossed around like a beach ball.

One of the ways that tech companies are working to ensure protection and privacy for their customers is by developing better security protocols, such as end-to-end encryption. This type of encryption, now being put into use for things like messaging apps, means that a message is encrypted on the sender's device and can only be decrypted on the recipient's device. It also means the company that created the app can never see the content of the messages, share them, or have them fall into the wrong hands in a data breach, because the company's servers only ever handle scrambled ciphertext.
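To make the idea concrete, here is a minimal toy sketch in Python of the principle described above: the sender's device encrypts, the server only relays unreadable ciphertext, and only the recipient's device can decrypt. This is purely illustrative, not real cryptography; actual end-to-end messengers establish keys with a key-exchange protocol (such as Diffie-Hellman) and use vetted ciphers, whereas this sketch simply assumes the two devices already share a secret key the server never sees.

```python
import secrets

def xor_bytes(data: bytes, key: bytes) -> bytes:
    # Toy one-time-pad-style cipher: XOR each message byte with a key byte.
    # Real apps use vetted ciphers, not this.
    return bytes(d ^ k for d, k in zip(data, key))

# Assumption for this sketch: both devices already share a random secret key.
message = b"meet at noon"
shared_key = secrets.token_bytes(len(message))

ciphertext = xor_bytes(message, shared_key)    # encrypted on the sender's device

# The app company's server only ever relays `ciphertext`.
# Without the key, it cannot recover the plaintext.
assert ciphertext != message

decrypted = xor_bytes(ciphertext, shared_key)  # decrypted on the recipient's device
assert decrypted == message
```

The key property is that decryption happens only at the endpoints: anyone sitting in the middle, including the service provider, sees only the scrambled bytes.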

One company in particular has taken the time to weigh the pros and cons and decided the best approach was to skip end-to-end encryption by default on one of its apps. Google's new Allo messaging app was expected to employ this level of security across the board, but the company decided that was not compatible with its ongoing efforts in machine learning and artificial intelligence. This has some security experts and privacy advocates up in arms, as you can imagine.

Yes, this decision does leave the door open for your messages to be nabbed in a data breach. It also means law enforcement can seek a warrant for those messages if they can show probable cause that a crime has been committed and that the content of those messages is evidence of it. But for most app users, neither of those concerns registers high on their list of priorities because they're not sending sensitive information through the app.

The important takeaway is that Google did not make this decision through a lack of effort or through empty promises of security; the company isn't leaving your texts vulnerable due to oversight or a lack of protocols, as is too often the case in data breaches. This was an intentional decision because Google relies on user activity to "educate" its artificial intelligence and, for example, to offer you smarter reply and autocorrect suggestions.

When a company makes a conscious decision about its security measures and then makes consumers aware of that decision and the reasons behind it, the company is being transparent. This empowers consumers to make their own decisions based on the facts. If users are concerned about the lack of end-to-end encryption, there are plenty of apps that do offer it. The more concerning security risk comes when a company assures its customers that their data is safe and locked up tight, then fails to put in place the protections it promised. Basically, if you don't like the methodology behind Allo, don't use it. Before you use any app, make sure you understand its security and how it impacts your privacy.

Anyone can be a victim of identity theft, anyone can use our services, and anyone can help us help others. If you found this information useful, please consider donating to the Identity Theft Resource Center to help us keep our services free to the public.