You’ve probably heard of the genetic testing site 23andMe. Users send in a swab covered in their saliva for genetic decoding; once that code is translated, it’s viewable online as a pie chart of ancestry. 23andMe even offers an API that lets you share your genetic information with the REST of the world.

Genetic information is powerful stuff: it can countermand information that’s been passed down through a family, provide a clue to lost relatives, and even offer unexpected insights into one’s origins. But did you ever think genetic information could be used for access control? Stumbling around GitHub, I came across this bit of code: Genetic Access Control. Now budding young racist coders can check your 23andMe page before they let you into their website!

Seriously, this code uses the 23andMe API to pull a user’s genetic information, then runs access control based on the results. Just why you decide not to let someone into your site is up to you, but it can be based on any attribute the 23andMe API exposes. This is literally code to automate racism. That said, the author offers up a number of possible uses, many of which sound fairly legitimate. Imagine an online women’s support group that restricts access to women only. What if JDate didn’t just take your word for it that you were Jewish, and actually checked your DNA to make sure?
Source: Using DNA for Access Control
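The pattern is simple enough to sketch. The following is a minimal illustration of the idea, not the actual Genetic Access Control code: the real project authenticates a visitor via OAuth against the 23andMe API, whereas here the profile is a hard-coded stand-in and the field names (`sex`, `ancestry`) are illustrative assumptions, not the real API schema.

```python
# Sketch of genetic access control: fetch a visitor's genetic profile
# (stubbed here), then apply a gatekeeper-supplied rule to it.

def grant_access(profile, rule):
    """Return True if the genetic profile satisfies the gatekeeper's rule."""
    return bool(rule(profile))

# Hypothetical rule for the women-only support group example above.
def women_only(profile):
    return profile.get("sex") == "female"

# Stand-in for data that would really come from the 23andMe API.
visitor = {"sex": "female", "ancestry": {"European": 0.62, "East Asian": 0.38}}
print(grant_access(visitor, women_only))  # True
```

The same `rule` hook could just as easily test an ancestry percentage, which is exactly what makes the technique so troubling.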
Chris Dancy recently tweeted, “We don’t have a privacy problem with data we have a conveniency problem with data.” How true. We live in a day and age when we have become more desensitized to how our data is used to make our lives just a little more convenient. Earlier this week, Wired published a great story on Disney’s MagicBand. Colleagues at work have described firsthand how convenient the MagicBand made their trip: it unlocked the door of their Disney Resort hotel room, admitted them to theme and water parks, checked them in at FastPass+ entrances, connected Disney PhotoPass images to their account, and even charged food and merchandise purchases to their room. Convenience?
It’s delightful, and it took hold faster than the goosebumps could. The utility seems so obvious, your consent has simply been assumed.
Source: Disney’s $1 Billion Bet on a Magical Wristband
The biggest risk to your privacy, and to your company’s, is your smartphone.
David Perry writes on the dilemma of maintaining boundaries between work and private spaces on social media, and the choice of whether to allow colleagues into our private social circles.
In a world in which virtual scholarly networks increasingly overlap with our personal virtual communities, we need to develop some clear standards for how we engage on social media with our colleagues, superiors, and subordinates. Here are my suggested rules:

- Be aware of workplace hierarchies and your position in them.
- You get to choose whether to “friend up” to people more powerful than you in the hierarchies.
- You do not get to choose whether to “friend down” to your subordinates. They get to make that choice.
- Either accept 100 percent of friend requests from subordinates or accept none. No middle ground.
Source: Should you Friend your Supervisor?
The most striking thing about the early Gmail patents is how exhaustive they were in attempting to anticipate every conceivable attribute of an email message that might one day be exploited for ad targeting purposes. In many cases it would be years before Google was actually able to make these ideas operational in Gmail. The first version of ad serving in Gmail exploited only concepts directly extracted from message texts and did little or no user profiling; profiling would only be put into practice much later. Some attributes have still not been implemented today and perhaps never will be. For example, as far as I know, Google does not reach into your PC’s file system to examine other files residing in the same directory as the file you attach to a Gmail message, even though the patents explicitly describe this possibility.
Source: Jeff Gould on The Natural History of Gmail Data Mining
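That first-generation approach, matching ads against concepts extracted from the message text itself with no user profiling, can be sketched crudely. This is a toy illustration, not Google’s implementation; the ad inventory, stopword list, and scoring rule are all invented for the example.

```python
# Toy contextual ad matching: extract salient terms from a message body,
# then pick the ad whose keyword set overlaps those terms most.
import re
from collections import Counter

STOPWORDS = {"the", "a", "an", "to", "of", "in", "and", "for", "is", "on", "my", "next"}

# Invented ad inventory: ad name -> trigger keywords.
ADS = {
    "TravelDeals": {"flight", "hotel", "vacation", "travel"},
    "CoffeeClub": {"coffee", "espresso", "roast"},
}

def extract_concepts(text):
    """Crude 'concept' extraction: counts of non-stopword tokens."""
    tokens = re.findall(r"[a-z']+", text.lower())
    return Counter(t for t in tokens if t not in STOPWORDS)

def match_ad(text):
    """Return the ad with the highest keyword overlap, or None if no overlap."""
    concepts = extract_concepts(text)
    scored = {name: sum(concepts[k] for k in kws) for name, kws in ADS.items()}
    best = max(scored, key=scored.get)
    return best if scored[best] > 0 else None

print(match_ad("Booking a flight and a hotel for my vacation next week"))
```

Everything here is derived from the one message in hand; the later profiling the patents anticipated would instead accumulate signals across a user’s whole mailbox and behavior.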