Monday, 31 August 2015

Galaxy Note 5 review: Continuing Samsung's reign as the King of Phablets

If I could credit Samsung with anything, it's making big phones into a fad for the rest of us. The company basically birthed the idea of a phone-tablet hybrid, and if it wasn't for the Note series, Apple might not have even considered bumping the iPhone up to 5.5 inches. (You're welcome, iPhone users.)

With that said, Samsung continues to improve the Note series with every new version. The Galaxy Note 5 is no different: it boasts a beautiful screen, powerful innards -- including an extra gigabyte of RAM -- and a suite of helpful productivity apps to accompany its super-precise stylus, the S Pen. If you haven't considered a mega-sized Android phone until now, the Galaxy Note 5 is the best place to start. It's so good at everything it does, it outshines the rest of the Galaxy family of smartphones.

A big body, a big screen

The Galaxy Note 5 is just as sleek as its predecessors, despite its larger size. 
The Galaxy Note 5 looks like a bigger, more professional version of the Galaxy S6. It has the same metal and glass construction, rounded edges, and the barely-there bezel on the sides of the display. But it's slightly denser and a little heavier to hold, which is especially apparent when you're actually talking on the phone. As for the S Pen, it's tucked away inconspicuously on the bottom near the charging port. When it's docked, it just looks like an extra button.
The S Pen resides in that small slot on the bottom of the Note 5.
Samsung slimmed down the Galaxy Note 5 enough so that even smaller hands like mine can have an easier time grasping it. It's still a pretty big phone, though, so don't expect something that easily fits in your pocket.
I don't feel like I have the tiniest hands in the world with the Galaxy Note 5. 
Samsung stuck with its usual button layout: a physical Home button with a fingerprint sensor built in, two capacitive navigation buttons on either side ("back" on the right and "recent apps" on the left), a volume rocker on the upper left side, and a power button on the right. If you've been a Samsung user for a while, you know this routine already. But if you're a newbie, you'll have a whopper of a time getting used to the Note 5's reversed navigation buttons. I wish Samsung would change the placement of those buttons, or at least leave it up to the users to choose for themselves.
The Note 5's Quad HD Super AMOLED display is just as vibrant as its siblings', and even brighter. Because of the screen's larger size, you can actually share a video with a friend without worrying that they can't see much. My fiancé and I spent time going through our YouTube subscriptions on the Note 5 and we were pretty comfortable watching together on its 5.7-inch display. The only setback was that I could only hold the phone a certain way, lest I muffle the tiny speakers on the bottom side.
The larger screen size is also great for productivity. I have a better grip for thumb typing, so emails can be penned more quickly. Editing and cropping photos and videos is much easier, and reading an ebook or digital magazine is possible without bumping up the font size. As an added bonus, Samsung optimized the display so that it doesn't use up as much energy at a given brightness level. So, if you like to read on your phone, have at it! The Note 5 can handle it.

Phenomenal performance

The Galaxy Note 5 smashed its predecessor in our performance benchmarks, though it's on par with this year's smaller-sized flagships.

The US variant of the Galaxy Note 4 came with a Snapdragon 805 processor, which was a good chip, but not as fast as Samsung's latest Exynos. The Galaxy Note 5 runs the same 64-bit Exynos 7 Octa 7420 processor as the Galaxy S6 and S6 Edge. It's fast, it's furious, and it's also energy efficient. I put all the benchmarks into their own article so you can see an in-depth rundown of the Note 5, including whether its 4GB of RAM actually helps performance. (Hint: it does.) Just be warned that this phone gets almost scalding on hot days -- I was not thrilled benchmarking it in 109-degree California heat, and had to stop for fear of burning my hands.
I want to reiterate that my day-to-day experience with the Galaxy Note 5 mirrors what the chart above shows. The Galaxy Note 5 lasts a really long time under varying usage, and that's with location, Wi-Fi, and data on. Granted, if you're using turn-by-turn directions or playing a game, you're going to use up juice much more quickly than you would leaving the phone in your bag with the display off. But I like that I can do the latter without worrying that the phone is dying just by being on.

One of the best cameras on the market

The Note 5's rear-facing 16-megapixel sensor is one of the best on the market. 
Like the Galaxy S6 and S6 Edge, the Note 5 is an Android phone with a camera you can rely on in any situation. It features the same camera sensors: a 16-megapixel rear-facing camera and a 5-megapixel front-facing one.

This puppy was too cute not to snap a photo -- with my finger in the corner! 

The Galaxy Note 5 also performed on par with its predecessors in our lab camera tests. 

The Note 5 has particularly wonderful low-light performance, made better by the fact that the phone now features manual shutter speed controls of up to 10 seconds.

And unlike the last-generation Note 4, the Note 5 won't drown out your subjects with the rear-facing flash.
Samsung didn't just copy and paste the camera sensor from the Galaxy S6 and leave it at that; it tossed in some neat new tricks in the camera app, including the ability to select the shutter speed. This can lead to professional-looking night time shots, though you're limited to a 10-second exposure.

The stars weren't out, but I was able to capture some of that nighttime atmosphere with a 10-second exposure. You can see the indoor light illuminating the top of the umbrella in the left-hand corner. Typically, this picture would have come out pitch black.
I went more in-depth with the new camera features in the Note 5 here, including its YouTube streaming functionality and all of its new manual settings. It's convenient having the features you'd use with a third-party app baked right into the native camera app.

Can your Palm Pilot do this?

I'm going to skip over TouchWiz here because it's exactly the same as what's on the Galaxy S6: it's lighter, it's blue, and it comes preloaded with a suite of Microsoft apps that you can't delete. You'll notice in my photos that I loaded the Google Launcher on my review unit after a while because I like having Google Now permanently affixed to the left of the Home screen. I still have to contend with the garish Quick Settings, but it's not that bad. Also, Samsung's been much better about software updates in the last year, so you should be fine with timely Android updates (if your carrier doesn't hold you back).
The new S Pen feels more like a real pen.
I've always believed that it's the included stylus that makes the Note series worthwhile. Samsung updated the S Pen with a clicky top and a nib that looks more like a ballpoint pen's. It's also a bit denser, so it feels balanced when you hold it. I kept accidentally putting the S Pen away in my pencil pouch, thinking it was an actual pen.
Air Command now takes up the entire screen. 
Of course, the real benefit of the S Pen is its accompanying software. Samsung overhauled the Air Command screen so that it's an entire page of icons, rather than just a pop-up overlay like on the Note 3 and Note 4. The usual suspects are still there: Action memo, Smart select, Screen write, and S note. You can also add two of your own shortcuts for any third-party apps that take advantage of the S Pen.

I'll be honest: I mostly used the S Pen to collage makeup and clothing I want to buy. Whatever!
The S Pen-specific apps have been polished up a bit, too. Now when you write on screen, you'll hear a cute swishy sound that's supposed to mimic the sound of a pen on paper. There's also the ability to pop out the pen and start writing on the Lock screen, or you can capture an entire webpage in any browser app with Scroll capture. I went more in-depth with these features here. Samsung also bundled in the ability to do PDF annotations on the fly, which are way easier to do with the S Pen than with just your finger. I edited a letter for my mom recently and didn't immediately feel the need to run to my computer to take care of edits.

Audio tricks that aren't gimmicky

Samsung got a little "jealous" of its competitors' audio-enhancing capabilities, so it made its own.
Samsung's been pretty consistent about delivering powerful phones with fantastic displays, but you could accuse it of leaving sound quality behind. It changed its tune this year -- pun intended -- by introducing a suite of sound-quality enhancement features to the Galaxy S6 and S6 Edge, including Adapt Sound, which calibrates your headphones, and SoundAlive+, which helps recreate the effect of surround sound even when it's not present. These features work pretty well, though the effect isn't as pronounced as what HTC offers with BoomSound, for instance.
Samsung includes software that calibrates the audio for you to make it sound fuller and richer.
The Galaxy Note 5 brings with it a feature called Ultra High Quality Audio (UHQA), which helps "enhance the sound resolution of music and videos." In practice, it seems to just enhance the bass of whatever you're listening to, and it's not entirely apparent unless you download music or videos directly to your device. I tried it out with Chromeo's latest album and, like I originally described in my hands-on, it just sounds like someone finally wired the smartphone "stereo" correctly. I'm bummed it doesn't currently work with streaming apps like Spotify and Digitally Imported, though there is some more third-party support coming soon. Out of the box, Pandora and YouTube are the only apps that take advantage of this new sound-enhancing feature.

Pay with your phone -- just not yet

I wanted to mention very briefly that while the Galaxy Note 5 currently supports wireless payments with NFC via Google Wallet, the Samsung Pay feature is not live yet. It'll be in beta this month, and is scheduled to launch in September. When that happens, I'll be taking it out for a test drive in the real world. Stay tuned.

The King of Phablets

Your new King of Phablets.
Samsung's phone-tablet hybrid device no longer feels like it's been made to cater to an elite group of professional smartphone users. It's for both the business-centric user and the creative types who want to doodle and dawdle all day with their smartphone in hand.
I enjoyed the last two generations of Samsung's Note phablet and I gave them both high scores, but I'm giving the Galaxy Note 5 a slightly higher score because it's absolutely everything you want out of a smartphone: a fantastic camera, a productivity device, a sketchbook, a digital scrapbook, a boom box, and a portable gaming console. If you're going to spend gobs of money for the most premium smartphone out there, it's gotta be completely worth your while, and the Galaxy Note 5 is totally worth it. If Samsung would only cut back on the heavy-handed TouchWiz interface changes and bloatware, it could easily score 5 stars.


The 5 mobile technologies to watch in 2014

64-bit apps, motion coprocessors, iBeacons, Miracast, and MBaaS all could be on the brink of achieving great things.

Take a step back from your iPad, iPhone, Galaxy, or whatever for a moment. What you hold in your hand today should undergo serious improvements in 2014, given the groundwork laid in 2013. For some people, taking advantage of those improvements will mean getting new devices, but many current device owners -- especially those who bought Apple's latest models -- will access them in what they already own.

1. 64-bit apps

iOS 7 debuted with the 64-bit Apple A7 processor in the iPhone 5s, iPad Air, and iPad Mini with Retina display. Apple's Xcode 5 IDE allows creation of 64-bit apps from existing code, so the iOS world will see 64-bit apps become common in 2014. As with the transition to 64-bit apps in Mac OS X Snow Leopard, most apps won't really take advantage of the greater processing and memory capabilities in their first 64-bit versions, both because developers won't have figured out how to get the maximum effect in the first go-round, and because they won't want the 32-bit versions of their apps used on older devices to be radically inferior until enough of the market has 64-bit devices.

After Apple debuted the A7 in September, several Android smartphone makers said they too would ship 64-bit devices, likely using a recent ARM reference design. But that won't do much for them until Google has a 64-bit version of Android to run on it. Expect that in the second half of 2014, giving iOS nearly a year's lead time in 64-bitness.
2. Spatial sensitivity

Motorola Mobility debuted its X8 motion coprocessor in the Moto X this year, and Apple followed up with its own M7 motion coprocessor in the iPhone 5s, iPad Air, and Retina iPad Mini. You won't find these coprocessors in PCs, which tend not to be used in motion or with devices that move. But smartphones and tablets are used on the go and for items in motion, such as fitness monitors and navigation devices.

A motion coprocessor will make it easier for mobile devices to incorporate tracking of their own motion as well as that of peripherals into their computing. Using a coprocessor means there's less drag on (and power usage from) the main processor, so apps that use spatial sensitivity derived from motion can run all day -- even when the device is asleep. If you've used GPS on your mobile device and seen how it burns through your battery in minutes, you know avoiding that drainage is critical to making it a capability you'll leave turned on.
As motion processing is built into more devices, apps and peripherals that make imaginative use of it will proliferate -- it's not just for runners and those trying to lose weight. Again, the iOS world will have a good year's lead time on this technology because motion processing is now standard in all new Apple devices, whereas only Motorola and parent company Google have it (so far) in the Android world.
3. Beacons

They're in your neighborhood Apple Store, and they're coming to sports stadia, shopping malls, and perhaps downtowns. These little devices use Bluetooth to communicate with your mobile device and a Wi-Fi or Ethernet connection to connect to the Internet, serving as an information waystation. That may sound like just a Wi-Fi access point, but it's not -- in fact, beacons aren't access points at all.

Instead, they're location-specific points of contact. That means they serve a small area -- Bluetooth's roughly 30-foot range -- to provide custom interaction related to that specific area. For example, a walking tour, zoo, or museum could use them to know what you're looking at and provide links to relevant details or to play an audio or video for that tour segment. A stadium could use them to know where you are so that the food you ordered gets to you faster or to tell you the nearest restroom's location. A store's online help or inventory system would know what department you're shopping in.
Beacons don't require interaction, of course -- they can simply record the Bluetooth network addresses of devices that come in range to build a model of foot traffic, where people tend to linger, and so on, all of which would be of great interest to retailers, urban planners, and police. But the interesting applications for individual users will involve websites and apps that interact with beacons to know where you are, then customize content and services accordingly. There's a lot of potential for innovation with beacons, as well as potential for marketing and other privacy abuses.
Apple is the power in beacon technology -- its iBeacons technology is in every iOS 7 device. iBeacons even lets iOS devices act as beacons (all the retailer iPads and iPod Touches now have a new use). But several companies sell stand-alone beacons, as well as beacon protocols and services that can be used in apps across multiple platforms. Some of those also use Apple's iBeacons protocols, of course.
Because Apple has by far the broadest beacon-capable user base, expect it to be the center of gravity for this technology. Again, expect Google to introduce a similar set of APIs and OS-level hooks in Android at some point.
4. Miracast

In March 2008, Apple reworked its failed Apple TV device to be a stand-alone media streamer for both local (iTunes) content and online (iTunes Store) content. In September 2010, Apple reworked its little-used AirTunes technology as AirPlay, allowing iOS and OS X devices to wirelessly stream video and audio content to the Apple TV and licensed AirPlay speakers. The combination of AirPlay and the Apple TV revolutionized media consumption, letting computers and mobile devices stream content to a variety of playback devices, as well as receive (in the case of iPad and iPhones) media from other devices. The technology has also gained traction in some businesses for conference room presentations.

But in the rest of the technology world, media streaming is a mess. The Android world has three types of physical video connectors in use (MHL, MiniHDMI, and Mobility DisplayPort), as well as two video-streaming technologies (DLNA and Miracast). Windows 8 uses WiDi, Intel's Wireless Display technology built into its current graphics coprocessors. Amazon.com's new Kindle Fire HDX tablets support Miracast.
Miracast promises to change that, though it had a rough start in 2013. Backed by the Wi-Fi Alliance that rendered the once-messy 802.11 protocols interoperable, Miracast is meant to make wireless video streaming interoperable across computers, mobile devices, and entertainment devices like stereos, TVs, and speakers. Although Intel's WiDi incorporates the Miracast standard, many Windows PCs need driver updates to get Miracast support to actually work. The Kindle Fire HDX is certified with only one Miracast device, the Netgear Push2TV -- undermining the interoperability promise of Miracast. So far, only Google's and subsidiary Motorola Mobility's recent Android devices support Miracast.
2014 will be the year that Miracast pulls together and delivers on its promise -- or follows the fate of DLNA, the clunky standard introduced in 2003 that often fails when mixing devices from different manufacturers and thus flopped in the living room.
5. MBaaS

It's one of the ugliest terms in tech today, and its meaning is highly variable and confusing, but mobile back ends as a service (MBaaS) is both increasingly important to developers and, I believe, about to go through a major shift. Forrester Research had a good explanation of MBaaS in 2012 when the term began to proliferate: middleware for the data management and authentication services that mobile apps would need if the apps were part of a deeper data-driven service. Today, MBaaS is used to mean almost any cloud-resident service an app may need access to, such as video rendering, payment processing, location information lookup, and ad serving.

One sales pitch for today's expansive "any services" version of MBaaS is that mobile devices have too little processing power and storage capacity to do "real" computing, so they need an assist from the cloud. That's not true with the Apple devices and high-end Android devices from the last few years, of course, but it's true that tapping into the cloud provides an almost limitless set of capabilities that developers can use rather than re-create, allowing them to weave together more functionality.
If you use Chrome OS, the Chrome browser, or Windows 8's Metro side -- or Web apps in general -- you already see that the mashup notion that briefly shone in the late 2000s is alive and well, but without that name. MBaaS is now effectively a services offering for functionality, rather than apps or infrastructure, that app developers will pull together no matter what devices their software runs on.
That's where I believe the MBaaS shift will occur in 2014. The "M" part will go away because the same logic applies to desktop and Web apps, too. The "B" part will also go away, because the notion of a back end is too confining and assumes a central data center model when in fact services (like APIs) will come from multiple sources and be federated. It's really just services, and they will enrich mobile apps even more.
In other words, MBaaS is going to yield to simply cloud-delivered services. That's why Software AG bought former mashup king JackBe, eBay's PayPal unit bought StackMob, Facebook bought Parse, Salesforce.com's Heroku unit partnered with AnyPresence, and Google and Microsoft offer MBaaS functionality in their platform and infrastructure services.
This article, "The 5 mobile technologies to watch in 2014," by Galen Gruman was originally published at InfoWorld.com.



Mini X-ray source with laser light


August 14, 2015
Max Planck Institute of Quantum Optics
Physicists have developed a method using laser-generated X-rays and phase-contrast X-ray tomography to produce three-dimensional images of soft tissue structures in organisms.

Physicists from Ludwig-Maximilians-Universität, the Max Planck Institute of Quantum Optics and the TU München have developed a method using laser-generated X-rays and phase-contrast X-ray tomography to produce three-dimensional images of soft tissue structures in organisms.
With laser light, physicists in Munich have built a miniature X-ray source. In so doing, the researchers from the Laboratory of Attosecond Physics of the Max Planck Institute of Quantum Optics and the Technische Universität München (TUM) captured three-dimensional images of ultrafine structures in the body of a living organism for the first time with the help of laser-generated X-rays. Using light-generated radiation combined with phase-contrast X-ray tomography, the scientists visualized ultrafine details of a fly measuring just a few millimeters. Until now, such radiation could only be produced in expensive ring accelerators measuring several kilometers in diameter. By contrast, the laser-driven system in combination with phase-contrast X-ray tomography only requires a university laboratory to view soft tissues. The new imaging method could make future medical applications more cost-effective and space-efficient than is possible with today's technologies.
When the physicists Prof. Stefan Karsch and Prof. Franz Pfeiffer illuminate a tiny fly with X-rays, the resulting image captures even the finest hairs on the wings of the insect. The experiment is a pioneering achievement. For the first time, scientists coupled their technique for generating X-rays from laser pulses with phase-contrast X-ray tomography to visualize tissues in organisms. The result is a three-dimensional view of the insect in unprecedented detail.
The X-rays required were generated by electrons that were accelerated to nearly the speed of light over a distance of approximately one centimeter by laser pulses lasting around 25 femtoseconds. A femtosecond is one millionth of a billionth of a second. The laser pulses have a power of approximately 80 terawatts (80 x 10^12 watts). By way of comparison: an atomic power plant generates 1,500 megawatts (1.5 x 10^9 watts).
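To put those two numbers side by side, here's a quick back-of-envelope sketch; the per-pulse energy is not stated in the source, so it's estimated here simply as peak power times pulse duration:

```python
# Quick sanity check of the quoted figures (values taken from the text above).
laser_peak_power_w = 80e12   # 80 terawatts of peak power per laser pulse
power_plant_w = 1.5e9        # ~1,500 megawatts from an atomic power plant
pulse_duration_s = 25e-15    # each pulse lasts roughly 25 femtoseconds

# Each pulse briefly delivers tens of thousands of times a power plant's output...
ratio = laser_peak_power_w / power_plant_w
print(f"Peak power ratio: ~{ratio:,.0f}x")  # ~53,333x

# ...but because the pulse is so short, the energy per pulse is modest.
pulse_energy_j = laser_peak_power_w * pulse_duration_s
print(f"Estimated energy per pulse: {pulse_energy_j:.1f} J")  # ~2.0 J
```

In other words, the trick is concentration in time: a couple of joules squeezed into 25 femtoseconds momentarily dwarfs a power plant's continuous output.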
First, the laser pulse ploughs through a plasma consisting of positively charged atomic cores and their electrons like a ship through water, producing a wake of oscillating electrons. This electron wave creates a trailing wave-shaped electric field structure on which the electrons surf and by which they are accelerated in the process. The particles then start to vibrate, emitting X-rays. Each light pulse generates an X-ray pulse. The X-rays generated have special properties: They have a wavelength of approximately 0.1 nanometers, which corresponds to a duration of only about five femtoseconds, and are spatially coherent, i.e. they appear to come from a point source.
For the first time, the researchers combined their laser-driven X-rays with a phase-contrast imaging method developed by a team headed by Prof. Franz Pfeiffer of the TUM. Instead of the usual absorption of radiation, they used X-ray refraction to accurately image the shapes of objects, including soft tissues. For this to work, the spatial coherence mentioned above is essential.
This laser-based imaging technique enables the researchers to view structures around one tenth to one hundredth the diameter of a human hair. Another advantage is the ability to create three-dimensional images of objects. After each X-ray pulse, meaning after each frame, the specimen is rotated slightly. For example, about 1,500 individual images were taken of the fly, which were then assembled to form a 3D data set.
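The scan geometry above can be sketched with a line of arithmetic; note the 180-degree scan range is an assumption (a common choice in tomography), since the article only gives the number of frames:

```python
# Hypothetical scan parameters; only the frame count comes from the article.
num_projections = 1500   # ~1,500 individual images were taken of the fly
scan_range_deg = 180.0   # assumed half-rotation scan range (not stated in source)

# Angular increment between consecutive X-ray frames.
step_deg = scan_range_deg / num_projections
print(f"Rotation between frames: {step_deg:.2f} degrees")  # 0.12 degrees
```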
Due to the shortness of the X-ray pulses, this technique may be used in future to freeze ultrafast processes on the femtosecond time scale e.g. in molecules -- as if they were illuminated by a femtosecond flashbulb.
The technology is particularly interesting for medical applications, as it is able to distinguish between differences in tissue density. Cancer tissue, for example, is less dense than healthy tissue. The method therefore opens up the prospect of detecting tumors that are less than one millimeter in diameter in an early stage of growth before they spread through the body and exert their lethal effect. For this purpose, however, researchers must shorten the wavelength of the X-rays even further in order to penetrate thicker tissue layers.

Story Source:
The above post is reprinted from materials provided by Max Planck Institute of Quantum Optics. The original item was written by Thorsten Naeser. Note: Materials may be edited for content and length.

Journal Reference:
1.    J. Wenz, S. Schleede, K. Khrennikov, M. Bech, P. Thibault, M. Heigoldt, F. Pfeiffer, S. Karsch. Quantitative X-ray phase-contrast microtomography from a compact laser-driven betatron source. Nature Communications, 2015; 6: 7568. DOI: 10.1038/ncomms8568


This New App Lets Everybody Edit Photos Like the Pros

There are already plenty of iPhone apps that allow you to make simple adjustments to your photos, such as adding filters and effects, tweaking the color and brightness, etc. But there are few apps designed with professional photographers and artists in mind.
That's where Astropad comes in. The full version of Astropad lets you use your iPad as a graphics tablet, and now the company just launched a new version of the app for iPhone too.
The new app is called Astropad Mini; it costs $4.99 as part of a launch promotion starting Thursday, and its full price will be $9.99.
A graphics tablet is a pressure-sensitive slate that photographers and artists use to easily edit photos on their computers. It allows you to control your computer mouse with a stylus on the tablet so that you can touch up photos with more precision.
So, for instance, if you're editing an image in Photoshop, you'd be able to move the cursor to edit intricate details of the photo using a pen rather than a computer mouse or trackpad.
These tablets are usually pretty expensive, however. High-end graphics tablets, such as those made by Wacom that come with displays, can cost around $1,000.
That's why Matt Ronge and Giovanni Donelli, Astropad's two cofounders who previously worked as engineers at Apple, decided to make something cheaper that would work directly on your iPad. The iPad app debuted earlier this year, but the company said it heard a lot of requests for an iPhone version too.
"It's going to open up access to a lot more people," Ronge told Business Insider.
Like a graphics tablet, Astropad works by connecting your iPhone or iPad to your Mac so that you can mark up images using the touch screen on your iPhone or iPad. And, since the Astropad app mirrors what's on your Mac's screen, you don't have to keep an eye on your Mac's monitor while drawing on the tablet like you would with a cheaper graphics tablet.
Here's a look at what the new Astropad Mini iPhone app looks like:
Image Credit: Astropad
Ronge also says the user interface has been tweaked in the iPhone version of the app. The team didn't just shrink down the version made for the iPad — it had to "re-think" the entire format and make it simpler since it's designed to work on a smaller screen.
"We had to really boil it down to its essence, what the most important stuff is," he said.
In addition to cutting the app down to its core features, the team also added some new functionality. You can now program shortcuts to work on your Apple Watch. So, for instance, if you wanted to undo an action, you could do so from the Apple Watch rather than finding the submenu in the photo editing program you're using on your Mac.
Ronge is particularly excited about Apple's yet-to-be-announced iPhone 6s, which is rumored to come with the same Force Touch technology Apple introduced with the Apple Watch. He thinks a feature such as Force Touch could make an app like Astropad even more useful for photo editors.
"We would have pressure sensitivity right there on the iPhone," he said.

The Future of the Internet of Things Will Be 'Notification Hell' Before It Gets Better

Right now, the Internet of Things (IoT) is primarily the realm of futuristic-minded early tech adopters. Think of the pioneers who use Google Glass to snap pictures, Nest to control their home temperature or turn to their smartphone to dim the light bulbs in their bedroom. In a decade, things will look much different. By 2025, the Internet of Things will become more mainstream, having an economic impact between $3.9 trillion and $11.1 trillion per year, according to a recent economic forecast released by consulting firm McKinsey & Company. The upper figure (including the consumer surplus) could account for as much as 11 percent of the world economy, the report states.
That’s a pretty staggering estimate, especially given that today the Internet of Things is still “in the early stages of growth,” according to the McKinsey report. While IoT has a lot of potential, getting it to become a more established industry could be challenging.
Speaking on an IoT panel at the Northside Innovation Festival in Brooklyn, N.Y. earlier this year, Gareth Price, director of technology at Ready Set Rocket, asked the audience who was currently wearing a connected device (smartphones don't count). Very few hands poked up. “This is like the hottest technology conference in one of the most connected cities in the world and there are maybe three people wearing a device right now,” said Price, proving his point that wearable technology is still the purview of a stark minority in 2015.

One of the primary reasons Internet-connected devices are still so rare is because often, each device requires its own separate operating software. Yet between 40 and 60 percent of the potential value of the IoT industry depends on devices being able to operate on the same software system, according to the McKinsey report. It’s onerous to manage dozens of software programs on your smartphone, along with all the push notifications and alerts. This is a fact of life that will likely get worse before it gets better, said Price.  “We are going to go through this phase where we have 10 to 15 different devices, and they will just be notifying the hell out of you, and you will have to turn them off manually.”
In addition to software applications needing to consolidate, for the Internet of Things industry to become more widely adopted, the buildings and city infrastructure we live and work in will have to become embedded with Internet connectivity for devices to seamlessly and continuously work.
“We need to be talking to architects and city planners about how to get off of the Internet and computer boxes and start thinking about our world as wired everywhere,” said Jocelyn Riseberg Scheirer, the CEO and founder of Bionolux, a company specializing in social wearable technology, who also spoke at the Northside Innovation conference. “That’s the only way that we are going to get bigger.”

The industry’s scalability will also be determined by a shift in our mindset. Indeed, as counterintuitive as it may seem, Internet of Things inventors and entrepreneurs envision a world where we spend less time glued to our smartphones, computer screens, and other connected devices. “We want to get to a place where we don’t have to sit on our phones to get these things to work,” said Jeff Bartenbach, the head of experience at Wink, an open software platform that connects to and controls multiple smart-home devices from one place.
The long-term goal of inventors and entrepreneurs in the IoT space is pretty grand. By living in a hyper-connected world, we will all be more connected to each other, not less so -- as it sometimes seems in our current technology-driven world of smartphones, computers and tablets.
“The first round of the Internet created very individualistic, detached experiences,” said Price. “In one way it has even taken us away from social networking. We are still in a box in a room. I think when the Internet becomes this kind of fabric on top of everyday life it will open up new opportunities to reestablish group and commonwealth experiences.”
