With cameras now becoming more commonplace in people’s lives, the next step would be for people to take pictures of themselves the same way they take pictures of friends and family. I think that would be a great idea – but only if it were easy for the average person to do.
I’m not sure that there’s anything wrong with taking a picture of yourself in a mirror, like on your lunch break or in the middle of the night. If you’re sitting at a desk, however, you may not have a mirror handy.
This isn’t the first time I’ve heard the idea that we should take better pictures. An article I read recently pointed out that the average American is about twice as likely to take a photo of themselves as the average European. So perhaps taking better pictures will help us become more like Europeans.
The theory behind this statement is that taking better photos will make us more like Europeans: take enough good pictures, and there’s a good chance we’ll start to resemble them. Take a picture of yourself at the beach, for example. You’ll probably think you’re getting pretty close to Europeans by the time you get back home. And then when you go out to your favorite bar, there’s a good chance you’ll look like a native. And so on.
The theory held by tech companies like Google and Intel (who have been working on phone cameras since the 1970s) is the same: by capturing better photos, we will become more like Europeans. It traces back to the idea that for every action there is an equal and opposite reaction.
This theory has been an important part of Western culture since the dawn of photography, but it has been hard to test until recently. Now, in a new study, Google and Intel have developed a set of algorithms that use data from Google’s Street View cars to predict people’s ethnicity. The study claims that the accuracy of these algorithms will improve in the years to come, leading to a new generation of “smart” people.
Basically, when you send your smartphone into the sky, you’re sending it into a virtual reality. And while that’s a pretty cool idea, I’ve never really understood why Google and Intel want to give us a VR headset.
Well, my first thought is that Google is probably just playing a long game of “will it be cool?” with Street View. I can see them doing this, because it is already cool. And also, I like the idea of people being able to walk around with their phone in their pocket and see what they look like.
Also, like most of Google’s other cool ideas, including Google Glass, VR as a way to see the world through your phone makes a lot of sense – but they should just stick with phones with sensors and cameras that you can actually see. Think about it like this: if you want a camera that you can see on a cellphone, you should get a camera with a lens that you can see through.
Well, yeah. But look how much more advanced it is now than it was a few years ago. We’re already seeing it in phones. You know, I can tell you from personal experience that the iPhone 4 has been a very good smartphone. I was a bit skeptical of it at first, because I thought it wasn’t going to be as good as the Galaxy S4. But then I took the plunge and got one.