A few questions and a thought

Written on 10 December 2014, 10:50pm


iPhone sensors

1. proximity sensor (turns off the screen when you hold the phone to your ear)
2. ambient light sensor (dims the screen in low light)
3. barometer (measures pressure/altitude; starting with the iPhone 6)
4. accelerometer (measures acceleration, i.e. the direction in which the device is moving)
5. gyroscope (measures rotation/orientation)
6. magnetometer (measures the strength and direction of magnetic fields)
In addition to these there is the GPS receiver. Read more about
the differences between the accelerometer, gyroscope and magnetometer. Apps read the three motion sensors through Apple's CoreMotion framework, as in the sketch below.
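A minimal sketch, assuming it runs inside an iOS app (e.g. from a view controller's viewDidLoad). The CoreMotion calls are the real API; the update rates and print formatting are just my choices:

```swift
import CoreMotion

let motion = CMMotionManager()

// Raw accelerometer readings, in g.
if motion.isAccelerometerAvailable {
    motion.accelerometerUpdateInterval = 1.0 / 10.0  // 10 Hz
    motion.startAccelerometerUpdates(to: .main) { data, _ in
        guard let a = data?.acceleration else { return }
        print(String(format: "accel  x: %+.2f  y: %+.2f  z: %+.2f g", a.x, a.y, a.z))
    }
}

// Raw gyroscope readings, in rad/s.
if motion.isGyroAvailable {
    motion.startGyroUpdates(to: .main) { data, _ in
        guard let r = data?.rotationRate else { return }
        print(String(format: "gyro   x: %+.2f  y: %+.2f  z: %+.2f rad/s", r.x, r.y, r.z))
    }
}

// Raw (uncalibrated) magnetometer readings, in microtesla.
if motion.isMagnetometerAvailable {
    motion.startMagnetometerUpdates(to: .main) { data, _ in
        guard let m = data?.magneticField else { return }
        print(String(format: "magnet x: %+.1f  y: %+.1f  z: %+.1f µT", m.x, m.y, m.z))
    }
}
```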

That being said, I would like to know:
1. Which sensors are turned off in airplane mode (accelero/gyro/magneto/GPS)?
Apparently only the GPS. Even that is not strictly necessary, since GPS is receive-only. In fact, some Android-powered phones do not turn off GPS in airplane mode.
Note: Apple says that airplane mode disables cellular, Wi-Fi, Bluetooth, GPS and location services. Saying that airplane mode disables location services is a bit redundant, since location services already use a combination of cellular, Wi-Fi, Bluetooth and GPS to determine your location.
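One way to check this empirically: start location updates via CoreLocation and toggle airplane mode. A rough sketch, assuming the usual Info.plist usage description is in place; the class name is mine, the delegate API is real:

```swift
import CoreLocation

// Hypothetical probe: observe whether location fixes still arrive in airplane mode.
final class AirplaneModeProbe: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()

    override init() {
        super.init()
        manager.delegate = self
        manager.desiredAccuracy = kCLLocationAccuracyBest
        manager.requestWhenInUseAuthorization()  // needs NSLocationWhenInUseUsageDescription
        manager.startUpdatingLocation()
    }

    func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
        // With airplane mode on, these updates stop arriving on iPhones.
        print("fix:", locations.last?.coordinate ?? "none")
    }

    func locationManager(_ manager: CLLocationManager, didFailWithError error: Error) {
        print("location error:", error.localizedDescription)
    }
}
```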

2. Which sensors are used by the Compass app (the app partially works in airplane mode)?
According to my tests (so based on empirical data 🙂), the Compass app works with a combination of the three sensors (accelero/gyro/magneto) that remain available in airplane mode. If GPS is also available, the latitude/longitude are displayed as well.
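This matches how the public APIs behave: CoreLocation's heading updates are driven by the magnetometer (tilt-compensated via the accelerometer) and need no radio at all, while the coordinates need an actual fix. A sketch of a compass-style reader; the class name is hypothetical, the delegate methods are the real API:

```swift
import CoreLocation

final class CompassReader: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()

    override init() {
        super.init()
        manager.delegate = self
        if CLLocationManager.headingAvailable() {
            manager.startUpdatingHeading()   // magnetometer-based, works in airplane mode
        }
        manager.requestWhenInUseAuthorization()
        manager.startUpdatingLocation()      // only delivers when GPS/location is available
    }

    func locationManager(_ manager: CLLocationManager, didUpdateHeading newHeading: CLHeading) {
        print(String(format: "heading: %.0f° magnetic", newHeading.magneticHeading))
    }

    func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
        guard let c = locations.last?.coordinate else { return }
        print(String(format: "lat/lon: %.4f, %.4f", c.latitude, c.longitude))
    }
}
```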

3. Which sensors are used by the Room Scan app (the app works fine in airplane mode)?
Again, based on empirical data: a combination of accelero/gyro/magneto, with no GPS use.
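Room Scan's internals are not public, but the fused data such an app could build on is exposed by CoreMotion as CMDeviceMotion, which combines accelerometer, gyroscope and magnetometer into an orientation plus gravity-free acceleration, all without GPS. A minimal sketch under that assumption:

```swift
import CoreMotion

let motion = CMMotionManager()

if motion.isDeviceMotionAvailable {
    motion.deviceMotionUpdateInterval = 1.0 / 30.0
    // .xMagneticNorthZVertical references yaw to magnetic north,
    // i.e. it folds the magnetometer into the sensor fusion.
    motion.startDeviceMotionUpdates(using: .xMagneticNorthZVertical, to: .main) { dm, _ in
        guard let dm = dm else { return }
        let att = dm.attitude          // fused orientation (roll/pitch/yaw)
        let acc = dm.userAcceleration  // acceleration with gravity removed
        print(String(format: "yaw %+.2f rad, accel %+.2f %+.2f %+.2f g",
                     att.yaw, acc.x, acc.y, acc.z))
    }
}
```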

I have not found a definitive answer yet. I'll keep digging.

Can machines determine what’s beautiful?

Services like Foap or EyeEm try to sell your casual smartphone pictures. Not a bad idea, considering the huge number of 'mobile' photos and the success of Instagram.
But what I find a bit over the line is this:

Now that the company has the layer of machine learning up and running (and learning new concepts every day), EyeEm is “training” its algorithms to identify which photos actually look good. By looking at things like which objects are in focus and blurred, what’s located at each third of a photo, and other identifiers of “beauty,” the ranking algorithms determine an EyeRank of aesthetic quality for each photo and applies an aggregated score to each photographer.
TechCrunch

Because sure, let's let the machines tell us what's beautiful. Humans are not good at it anymore. Makes sense.
In completely unrelated news, some smart people see a threat in artificial intelligence and forecast that the development of full artificial intelligence could spell the end of the human race.

Photo: "moon steps" (https://www.foap.com/users/dorinmoise)
