Care Companion

For patients, keeping up with a special treatment plan and staying educated about their illness are vitally important. A new generation of applications encourages patients to take charge of their own health.

Read More

Using GPUs for training TensorFlow models

In recent years, there has been significant progress in the field of machine learning. Much of this progress can be attributed to the increasing use of graphics processing units (GPUs) to accelerate the training of machine learning models. In particular, the extra computational power has led to the popularization of Deep Learning: the use of complex, multi-level neural networks to create models capable of feature detection from large amounts of unlabeled training data.
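To make the matrix intuition behind this concrete, here is a tiny, purely illustrative pure-Python sketch (not from the original post): a 2D camera-style rotation is nothing more than a matrix-vector product, and deep learning boils down to vast numbers of exactly this kind of operation, which GPUs execute in parallel:

```python
import math

def rotate(point, angle_deg):
    """Rotate a 2D point about the origin using a 2x2 rotation matrix."""
    t = math.radians(angle_deg)
    # Rotation matrix [[cos t, -sin t], [sin t, cos t]]
    m = [[math.cos(t), -math.sin(t)],
         [math.sin(t),  math.cos(t)]]
    x, y = point
    # Matrix-vector product: one row dotted with the input vector per output component
    return (m[0][0] * x + m[0][1] * y,
            m[1][0] * x + m[1][1] * y)

print(rotate((1.0, 0.0), 90))  # approximately (0.0, 1.0), up to floating-point rounding
```

A GPU applies the same idea to matrices with millions of entries, which is exactly the shape of the computations inside a neural network layer.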
GPUs are so well suited to deep learning because the calculations they were designed to perform happen to be the same kind encountered in deep learning. Images, videos, and other graphics are represented as matrices, so any operation you perform, such as a zoom effect or a camera rotation, is just a mathematical transformation applied to a matrix. In practice, this means that GPUs, compared to central processing units (CPUs), are far better at performing matrix operations and several other types of advanced mathematical transformations. As a result, deep learning algorithms can run several times faster on a GPU than on a CPU, and learning times can often be reduced from days to mere hours.

So, how would one approach using GPUs for machine learning tasks? In this post we will explore the setup of a GPU-enabled AWS instance to train a neural network in TensorFlow.

To start, create a new EC2 instance in the AWS control panel. In this guide we will be using Ubuntu Server 16.04 LTS (HVM) as the OS, but the process should be similar on any 64-bit Linux distro. For the instance type, select g2.2xlarge; these come with an NVIDIA GRID GPU. There are also instances with several of these GPUs, but utilizing more than one requires additional setup, which will be discussed later in this post. Finish the setup with your preferred security settings.

Once the instance is created, SSH into it. Python should already be present on the system, so install the required libraries:
sudo apt-get update
sudo apt-get install python-pip python-dev
Next, install TensorFlow with GPU support enabled. The simplest way is:
pip install tensorflow-gpu
However, this might fail for some installations. If this happens, there is an alternative:
export TF_BINARY_URL=https://storage.googleapis.com/tensorflow/linux/gpu/tensorflow_gpu-0.12.1-cp27-none-linux_x86_64.whl
sudo pip install --upgrade $TF_BINARY_URL
If you get a “locale.Error: unsupported locale setting” during the TensorFlow installation, enter:
export LC_ALL=C
Then repeat the installation process. If no further errors occur, the TensorFlow installation is complete. However, for GPU acceleration to work properly, we still have to install the CUDA Toolkit and cuDNN. First, let's install the CUDA Toolkit. Before you start, note that the installation process will download around 3 GB of data.
wget "http://developer.download.nvidia.com/compute/cuda/repos/ubuntu1604/x86_64/cuda-repo-ubuntu1604_8.0.44-1_amd64.deb"
sudo dpkg -i cuda-repo-ubuntu1604_8.0.44-1_amd64.deb
sudo apt-get update
sudo apt-get install cuda
Once the CUDA Toolkit is installed, download the cuDNN Library for Linux from https://developer.nvidia.com/cudnn (note that you will need to register for the Accelerated Computing Developer Program) and copy it to your EC2 instance. Then:
sudo tar -xvf cudnn-8.0-linux-x64-v5.1.tgz -C /usr/local
export PATH=/usr/local/cuda/bin:$PATH
export LD_LIBRARY_PATH="$LD_LIBRARY_PATH:/usr/local/cuda/lib64:/usr/local/cuda/extras/CUPTI/lib64"
export CUDA_HOME=/usr/local/cuda
Finally, the setup process is over and we can test the installation:
python
>>> import tensorflow as tf
>>> sess = tf.Session()
You should see a line like “Found device 0 with properties: name: GRID K520”.
>>> hello_world = tf.constant("Hello, world!")
>>> print sess.run(hello_world)
“Hello, world!” will be displayed
>>> print sess.run(tf.constant(123)*tf.constant(456))
56088 is the correct answer. The system is now ready to utilize a GPU with TensorFlow. The changes to your TensorFlow code should be minimal. If a TensorFlow operation has both CPU and GPU implementations, the GPU device will be prioritized when the operation is assigned to a device. If you would like a particular operation to run on a device of your choice instead of using the defaults, you can use “with tf.device” to create a device context. This forces all the operations within that context to have the same device assignment.
# Creates a graph.
with tf.device('/gpu:0'):
  a = tf.constant([1.0, 2.0, 3.0, 4.0, 5.0, 6.0], shape=[2, 3], name='a')
  b = tf.constant([1.0, 2.0, 3.0, 4.0, 5.0, 6.0], shape=[3, 2], name='b')
  c = tf.matmul(a, b)
# Creates a session with log_device_placement set to True.
sess = tf.Session(config=tf.ConfigProto(log_device_placement=True))
# Runs the op.
print sess.run(c)
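One caveat worth noting (a configuration detail from TensorFlow's device-placement documentation, not part of the original walkthrough): if the device named in tf.device does not exist on the machine, running the session will fail with an error. Enabling soft placement lets TensorFlow fall back to a supported device automatically:

```python
# Fall back to an available device if the requested one is missing.
sess = tf.Session(config=tf.ConfigProto(
    allow_soft_placement=True,
    log_device_placement=True))
```

This is handy when the same script is run on machines with different numbers of GPUs.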
If you would like to run TensorFlow on multiple GPUs, it is possible to construct a model in a multi-tower fashion and assign each tower to a different GPU. For example:
# Creates a graph.
c = []
for d in ['/gpu:2', '/gpu:3']:
  with tf.device(d):
    a = tf.constant([1.0, 2.0, 3.0, 4.0, 5.0, 6.0], shape=[2, 3])
    b = tf.constant([1.0, 2.0, 3.0, 4.0, 5.0, 6.0], shape=[3, 2])
    c.append(tf.matmul(a, b))
with tf.device('/cpu:0'):
  sum = tf.add_n(c)
# Creates a session with log_device_placement set to True.
sess = tf.Session(config=tf.ConfigProto(log_device_placement=True))
# Runs the op.
print sess.run(sum)
Next, we will take a closer look at the benefits of utilizing the GPU. For benchmarking purposes we will use a convolutional neural network (CNN) for recognizing images, provided as part of the TensorFlow tutorials. CIFAR-10 classification is a common benchmark problem in machine learning: the task is to classify 32x32 pixel RGB images across 10 categories. Let’s compare the performance of training this model on several popular configurations:
Instance type                | Price           | Time to complete training
MacBook Pro                  | -               | 21.6 hours
c4.xlarge (4x CPU)           | $0.199 per hour | 16.1 hours
c4.4xlarge (16x CPU)         | $0.796 per hour | 4.8 hours
g2.2xlarge (1x GPU)          | $0.650 per hour | 4.72 hours
g2.8xlarge (2x GPU out of 4) | $7.2 per hour   | 2.5 hours
g2.8xlarge (4x GPU out of 4) | $7.2 per hour   | 1.6 hours
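For reference, the CIFAR-10 model used for these timings comes from the TensorFlow tutorials. At the time of writing the tutorial code lived in the tensorflow/models repository (the paths may have changed since), and a training run can be started roughly like this:

```shell
git clone https://github.com/tensorflow/models.git
cd models/tutorials/image/cifar10
python cifar10_train.py                         # single-device training
python cifar10_multi_gpu_train.py --num_gpus=2  # multi-GPU variant
```

The multi-GPU script uses the same tower pattern shown above, with one model replica per GPU and gradients averaged on the CPU.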
As demonstrated by the results, in this specific example it takes the power of 16 CPUs to match the power of 1 GPU. At the time of writing, utilizing a GPU is also 18% cheaper for the same training time.

References:
https://www.tensorflow.org/get_started/os_setup
http://www.nvidia.com/object/gpu-accelerated-applications-tensorflow-installation.html
https://www.tensorflow.org/how_tos/using_gpu/
https://www.tensorflow.org/tutorials/deep_cnn/
http://www.nvidia.com/object/what-is-gpu-computing.html
Read More

DataArt Wins Microsoft Azure Certified ISV Solution Partner of the Year Award

New York, NY and St. Petersburg, Russia – June 16, 2016 – DataArt, the global technology consulting firm, announced today that it has won Microsoft Partner of the Year Award for Azure Certified ISV Solution in Russia, demonstrating excellence in innovation and implementation of customer solutions based on Microsoft technology.
Read More

Getting started with packages in DC/OS

Why would one create a package?

Once you get familiar with DC/OS, the open source project created by Mesosphere, you get access to packages certified by Mesosphere. There are several ways to deploy your service into DC/OS: (1) use the dcos marathon command in the CLI; (2) use the Marathon REST API directly; (3) deploy your service as a package. Using the package approach makes your solution consistent with the environment and brings other benefits. Please refer to the documentation on developing custom services and packages: https://github.com/dcos/dcos-docs/blob/master/1.7/usage/developing-services/service-requirements-spec.md. The aim of this post is to show how to create your own simple package and start using it in your DC/OS-managed cluster.
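As a taste of what a package involves, here is a minimal, illustrative package.json in the style of the DC/OS Universe format (all values here, including the hello-service name, are hypothetical; see the specification linked above for the authoritative schema):

```json
{
  "packagingVersion": "2.0",
  "name": "hello-service",
  "version": "0.1.0",
  "maintainer": "support@example.com",
  "description": "A minimal example service for DC/OS",
  "tags": ["example", "demo"]
}
```

A complete package typically also ships a marathon.json.mustache template describing how the service is launched.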
Read More

On-demand Interactive Data Science with DC/OS

Traffic accidents in the UK, 1979-2004.

Whether you are a journalist, a researcher, or a data geek, to start working with large data sets you normally have to complete the laborious tasks of setting up infrastructure, configuring an environment, learning unfamiliar tools, and coding complicated apps. With DC/OS you can start crunching those numbers within minutes.
Read More
IoT, Big Data, 5G and Virtual Reality – all a Reality at the Mobile World Congress 2016

From 27 February to 2 March, DataArt exhibited with Canonical at Mobile World Congress in Barcelona. The sheer scope of the world’s biggest mobile industry event was mind-boggling: 100,000 attendees and 2,200 exhibitors spanned nine halls and a dozen outdoor spaces at Fira Gran Via and Fira Montjuïc. DataArt demoed an enterprise predictive maintenance IoT solution, enabling preventative, condition-based monitoring of a piece of manufacturing equipment. We used accelerometer-based sensors and an IoT gateway running Snappy Ubuntu Core to capture the vibration profile of a fan and analyzed it in AWS, to determine whether it was within the range of normally operating equipment and, if not, to trigger a maintenance alert.
Read More

DataArt at MWC 2016

Barcelona, February 22-25, 2016. For the second consecutive year, DataArt will join Canonical at Mobile World Congress (MWC), the world’s largest gathering for the mobile industry, to demonstrate enterprise IoT solutions, big data, system integration, and scalability. DataArt will showcase the following IoT and cloud systems running on top of Canonical’s Ubuntu Snappy Core and Juju.
  • Software defined infrastructure and CPE
  • Home automation scenarios
  • Industrial automation and predictive maintenance
  • Scalable container-based cloud platform orchestrated by Juju
For any questions or to schedule a meeting, feel free to contact us.
Event details:
Date: February 22-24 (9am - 7pm), 25 (9am - 4pm)
Location: Fira Gran Via, Av. Joan Carles I, 64, 08908 L’Hospitalet de Llobregat, Barcelona
Booth: Hall 3 - 3J30
Read More

Learn IoT with the Best

Saturday, January 16, 2016, 10:00 a.m. to 6:00 p.m. (EDT). DataArt and DeviceHive are partners of a full-day online conference with one-to-one mentoring sessions specialized in IoT, right in the comfort of your home. Rafael Zubairov, our leading IoT expert, will show you how to build an open source IoT data platform with a wide range of device integration options. Learn IoT With The Best is a full-day online conference tailored for developers who want to explore the IoT universe more deeply. Right in the comfort of their homes, participants will enjoy an empowering interactive experience through a friendly platform specially designed for this event. Attendees will be supplied with a set of tools to interact with the best IoT experts:
  • a chatroom to ask live questions during the conferences;
  • Q&A forum after the conferences during which both experts and attendees can share knowledge and information;
  • one-to-one live mentoring sessions with the chosen expert(s);
  • downloadable presentations at the attendees’ disposal;
  • 6-month guaranteed access to the conferences, Q&A forum and one-to-one sessions.
More than 300 attendees are expected to listen to and interact with at least 12 technical experts selected worldwide and coming from various backgrounds: IoT meetup speakers, startup CTOs, and technical evangelists at large companies such as Microsoft and Amazon.
Useful information:
Official website: http://iot.withthebest.com/
Date: Saturday, January 16th, 10am to 6pm (EDT)
Venue: online
Twitter & official hashtag: @LearnWTB / #IoTWTB
Themes: IoT / Wearables / M2M / IoT Cloud / IoT Platform / Smart Cities / Connected Objects
50% discount for the first 50 registrants from the DeviceHive community: DataartIOT
Who will be speaking? Skilled developers, tech evangelists, CTOs of IoT startups, and more.
Who can participate? Skilled developers, designers, data analysts, makers, and entrepreneurs interested in IoT.
Read More

Inaugural New York Open Source IoT Summit a Resounding Success

DataArt, in partnership with Microsoft and Canonical, hosted its first annual Open Source IoT Summit in New York City. On November 12, 2015, six dozen technology innovators gathered at Microsoft’s New York Conference Center on Times Square to learn how they can develop their own in-house IoT solutions. DataArt has always supported the open innovation movement, which is at the heart of new technology development, and our open source IoT device-management platform DeviceHive is a testament to that. DeviceHive runs on Canonical’s Ubuntu, is available on the Microsoft Azure Marketplace, and provides the tools to solve any smart manufacturing or smart home challenge in-house, without costly investments in software solutions.

At the summit, we showed how DeviceHive accelerates IoT product development, allowing a solution prototype to be created in a matter of hours and then deployed and scaled to a limitless number of devices or control variables with no additional software or investment. We walked the audience through the design, prototyping, deployment, and scaling up of a predictive maintenance IoT solution, enabling preventative, condition-based monitoring of a piece of manufacturing equipment. We used accelerometer-based sensors and an IoT gateway to capture the vibration profile of a fan and analyzed it in the Microsoft Azure Cloud using Juju, to determine whether it was within the range of normally operating equipment and, if not, to trigger a maintenance alert. Continuously monitoring manufacturing environments for hazards, with the option to prompt people (or even machines) to take corrective action to avoid damage or interruption, can significantly reduce manufacturing risks and costs.
Device connectivity enables more than just monitoring and predictive maintenance; it ultimately allows for precise control and management of critical assets, automation of tasks and decision-making, and optimization of processes across the manufacturing value chain. That covers R&D, sourcing, production, and outbound logistics, which helps attain major reductions in waste, energy costs, and human intervention, leading to vast improvements in efficiency.

While manufacturing is the area where IoT is an obvious game changer, IoT presents a rich opportunity for all areas of our lives. Examples include a heart monitor implant that alerts care providers of important changes in a patient’s heart condition, a car with built-in sensors that alerts the owner’s phone when tire pressure becomes low or emissions high, or precision farming equipment with wireless links to data from satellites and ground sensors that adjusts the way each part of the field is farmed based on different soil and crop conditions. IoT can be used to build a home automation system that customizes home devices to the habits of its residents, eventually enabling smart cities: monitoring customers’ power usage behavior, managing power demand and supply to optimize city-wide electricity usage, enabling remote monitoring and maintenance of gas pipeline networks, or installing billboards that assess approaching human traffic and change display messages accordingly.

Connected devices are here to stay. Embracing objects’ ability to sense their environment and communicate about it presents unprecedented opportunities and insight across industry sectors and processes. The greatest challenge ahead is learning to convert vast amounts of data into actionable insight: to make sense of complexity and respond to it swiftly, eventually enabling machine learning and minimizing human intervention. DataArt and its partners look forward to continuing to share our experience with the IoT community.
We welcome new partnerships to create value through new Internet-of-Things capabilities.
Read More
Enterprise Developers can’t miss the NY Open Source IoT Summit

Hearing a lot about IoT lately? Want to learn everything from home automation to industrial IoT? Want to try enterprise IoT solutions yourself? The Open Source IoT Summit is about open source IoT and Azure IoT solutions that anybody can use. Join Microsoft, Ubuntu/Canonical, and DataArt to learn all about it and start creating IoT solutions together. Learn:
  • How to create and package enterprise IoT apps;
  • How to monetize IoT and sell IoT apps through IoT app stores;
  • IoT security;
  • How to easily support different IoT standards;
  • How to connect IoT devices to the cloud and use Azure IoT services;
  • Open source tools to easily write and package IoT apps in any language;
  • DeviceHive, the open source IoT platform that greatly accelerates your IoT product development;
  • How to automatically test changes and roll them out securely in production;
  • Sample industrial IoT solutions like open source predictive maintenance.
Bring your laptop, some IoT boards / toys / sensors and let’s start making great IoT solutions. Register now, free tickets are limited.
Read More
Open Source IoT Solutions on Azure

DataArt, the maker of DeviceHive, and Canonical, the maker of Snappy, Ubuntu and Juju, present Open IoT Solutions on Azure Events.

DataArt and Canonical are demonstrating industrial preventive maintenance and home IoT scenarios that can be prototyped, scaled, and deployed. DataArt’s DeviceHive, running on Canonical’s Ubuntu VM, is available on the Microsoft Azure Marketplace, providing access to a flexible IoT platform. New bundled IoT solutions and examples, including DeviceHive on Snappy (RPii), a data analytics stack deployed by Juju, and Microsoft Azure services, will be discussed and demonstrated.

Want to receive updates? Subscribe here!

Ready to visit the event? Find the details here!

The event will take place at Microsoft's New York Conference Center, Central Park East (6th fl, 6501a). November 12, 2015. 1 pm - 5 pm. 11 Times Square, NYC.

Read More

Strata+Hadoop World NYC 2015 Reflections

Machine learning, cloud, visualization, Hadoop, Spark, data science, scalability, analytics, terabytes, petabytes, faster, bigger, more secure, simply better. The kind of merry-go-round that keeps spinning in your head after you spend three days on the exhibit floor at the Strata+Hadoop conference. And lots of elephants, of course.
Not only did we attend Strata with fellow colleagues from DataArt and DeviceHive, we also helped our friends at Canonical and brought our demo to their booth. Canonical was showing Juju, a cloud infrastructure and service management tool. We brought our favorite demo: an industrial equipment monitoring rig. No PowerPoint slides, only real stuff. A Texas Instruments SensorTag accelerometer was attached to a fan to monitor its vibration; to simulate excess vibration, we used a piece of duct tape attached to one of the blades to throw the whole thing off balance. Sensor data was streamed using DeviceHive, generating time series data, which was aggregated by Spark Streaming and displayed on a nice dashboard. Everything was deployed using Juju, running nicely in AWS.

While the exhibition floor had a lot of great companies pitching their products, the main highlight of this year’s event was Spark. Learning Spark, running Spark, managing Spark, using Spark for this and using Spark for that. Almost everyone, big or small, was talking Spark, integrating it into their solutions or making their data accessible through Spark. In just a few years Spark has proven to be a great platform for data discovery, machine learning, and cluster computing in general. The Spark ecosystem will keep expanding, changing the way we work with our data and increasing the velocity of data-related projects. Next-generation analytics tools will surely interface with or rely on Spark, allowing enterprises to push the envelope of what can be derived from their data. Next-generation parallel computing tools will bring business, engineers, data scientists, and devops closer together. Databricks, a company commercially supporting Spark, was demonstrating its data analytics product, which allowed users to create research notebooks and interactively write Spark jobs, run them on an AWS cluster, create queries, and visualize data.
Add Spark Streaming on top of that, and you can execute your models on a live stream of data. While Databricks hosts the landing page for the UI, your data, as well as the machines hosting the infrastructure that runs Spark, reside in your AWS environment. I’m curious to see how it will compare with Amazon’s Space Needle, which they are unveiling at re:Invent 2015 in Las Vegas.

Besides Spark, it is also becoming apparent that working with data at large is no longer about the choice of the right database or distributed file system. Data platforms are coming. The world is starting to think in terms of data platforms: sets of technologies and architecture patterns designed to work together to solve a variety of data-related problems. A data platform largely defines how we access, store, stream, compute on, and search structured, unstructured, and sensor-generated data. A solid example of such a platform is the Basho Data Platform, where Basho is taking its Riak database and making it part of something much bigger than a key-value store.

Personal improvement takeaways:
  • Hack on public data in Spark
  • Keep learning and using Scala
  • Functional programming
  • Functional programming
  • Functional programming
Read More
IoT Solutions at Strata+Hadoop World NYC 2015

DataArt will be showcasing big data, IoT, and predictive maintenance solutions at Strata+Hadoop World NYC 2015, September 30 - October 1. Using Canonical's Ubuntu Snappy Core with orchestration by Juju, we will showcase how to deploy DeviceHive's lambda architecture and evolve your industrial IoT solution from proof of concept to a scalable production system.

Stop by Canonical/Ubuntu Booth #358. If you would like to connect at the show, please leave your contact information here. Looking forward to seeing you at Strata+Hadoop World NYC.

Read More

KidPRO App — Improving Adherence in Clinical Trials

At DataArt, our broad healthcare and life sciences experience, coupled with our deep technology expertise, allows us to clearly see the gaps in the healthcare technology field. For instance, there are many electronic patient-reported outcome (ePRO) systems and patient diaries used for chronic care management and clinical trials, but none of them has a patient interface engaging enough for adult patients, let alone children. By combining our vast expertise in user experience design, mHealth, and gamification, we have created a concept product that covers the current gap in pediatric mHealth solutions. Unlike existing products, DataArt’s concept application KidPRO uses these aspects to motivate young patients and turns managing their own healthcare or participating in a clinical trial into a fun and rewarding process.
Read More

DeviceHive Android BLE

DeviceHive, an open source IoT data platform with a wide range of device integration options, recently received an update to its Android Bluetooth Low Energy gateway. This update includes support for an extended range of GATT commands, support for short UUIDs of services and characteristics, and the capability to connect multiple BLE devices simultaneously. This presentation shows common use cases of the updated functionality.
Read More
Pills Adjutant — medication adherence tool for iOS and Android

The DataArt Orange team has published a new project: Pills Adjutant. Staying on track with a treatment plan relies heavily on self-management, yet those who have to take medicine often find it difficult to remember to take a dose at the right time, or at all. With Pills Adjutant, a medication adherence tool for iOS and Android, there is no need to remember: it helps you keep up with your medication schedule using modern wearable devices, such as the Apple Watch or similar devices for Android. See the full description and the screenshots on the project's page. A demo can be provided on request. Also, subscribe to our news below; we'll keep you posted.
Read More

Prototype IoT with $5 WiFi Microcontroller ESP8266 and DeviceHive

We are proud to announce a release of DeviceHive firmware that turns a $5 WiFi module into a fully functional standalone IoT board requiring no programming on the device side; you can access its GPIO, ADC, PWM, I2C, and SPI from the DeviceHive cloud. While marketed as a WiFi modem to be attached to a microcontroller, the ESP8266 is actually a fully capable IoT board by itself and doesn’t need any extra microcontroller. Using the DeviceHive cloud, you can access its pins right from your cloud apps using REST/JSON in your favorite programming language. How about an AngularJS UI that talks to this tiny thing? Feeling overwhelmed by the Raspberry Pi or Arduino environment and need a cheaper alternative? Want a WiFi-connected microcontroller that works with the cloud right away? Read on!
Read More
DeviceHive becomes a member of AllSeen Alliance

Along with companies like Microsoft, Cisco, Panasonic, Sony, and others, DeviceHive has become a member of the AllSeen Alliance. The main mission of the AllSeen Alliance is to enable widespread adoption and help accelerate the development and evolution of an interoperable peer connectivity and communications framework based on AllJoyn for devices and applications in the Internet of Everything.
Read More
DeviceHive Releases Version 2.0

We are proud to announce DeviceHive 2.0: faster, friendlier, more functional IoT Data Platform with a rich IoT Gateway framework. Get in touch with us if you want to learn more. Here are some of the key features included in this release.
Read More

IoT: Authentication Basics in DeviceHive

Hello to lovers of IoT and M2M things. My name is Artyom Sorokin, and I'm a software engineer who has gained great experience working on and developing IoT projects with the help of DeviceHive. This is a series of posts where I am going to show you the authentication and authorization models available in DeviceHive and how to use them. In this tutorial we will cover the basic auth approaches implemented in DeviceHive. If you are just looking for a complete list of approaches, without examples and the long description, proceed to the "Summary" section at the bottom of this post.
Read More
DataArt on the Leading Edge of Food Recognition

The DataArt Orange initiative has put a lot of development effort into its food recognition R&D project, and it finally feels like the market is ready for the technology. Last week, Google announced at the Rework Deep Learning Summit an artificial intelligence project to calculate the calories in pictures of food you have taken. According to The Guardian, “the prospective tool called Im2Calories, aims to identify food snapped and work out the calorie content”. There is not much information about the project and what algorithms it uses at the moment, but what is available indicates that Im2Calories utilizes an approach similar to the one used by DataArt’s Computer Vision Competence Centre researchers in their Eat’n’Click project.
Read More

Thinking About the Future of Wearables Today

Igor Kozhurenko, VP of Research & Development at DataArt, was asked to comment on the Apple Watch release and the future of wearables for the Sprint Business blog.
"The launch of Apple Watch has meant that wearable technologies – and smart watches in particular – are getting a lot of attention. Business Insider estimates that the smart watch will be the leading product category on the wearable market and will account for 59% of total wearable device shipments this year, expanding to just over 70% of shipments by 2019. Paired with a smartphone, smart watches can offer rich functionality across a number of verticals like healthcare, travel, the smart home, IoT and capital markets. Their future, however, is tied to two factors:
  • The reliability of the hardware;
  • Stellar services that could empower the people wearing wearables.
Here’s what needs to be understood: smart watches are not just another type of mobile device. They’re a new generation of technology, and call for a whole new approach to application design. For your app to thrive, you need to account for three basic realities:
  • The screen has a limited amount of space;
  • The user flow is entirely different from those of mobile applications;
  • Users expect a personalized experience.
DataArt’s design and R&D teams took all of the above into consideration when shaping our approach to building apps for wearables. We realized that whatever experiences the app is expected to deliver, they should be achieved in fewer than three taps, which requires prioritizing notifications and ensuring the app doesn’t become “spammy”, driving users away."
View original article or download PDF
Read More
Smart watch will win the market, and here’s why

The launch of Apple Watch has meant that wearable technologies – and smart watches in particular – are getting a lot of attention. Just think about this; according to Juniper research, over 70M fitness wearable devices are expected to be in use worldwide by 2018, up from 19M in 2014. On top of that, Business Insider estimates that the smart watch will be the leading product category on the wearable market and will account for 59% of total wearable device shipments this year, expanding to just over 70% of shipments by 2019.
Read More
DeviceHive at Microsoft Build 2015

DeviceHive at Microsoft Build 2015

We are glad to announce that the DeviceHive team, together with Canonical/Ubuntu, will showcase IoT solutions using Ubuntu Snappy Core and Microsoft Azure at Microsoft Build 2015. If you would like to chat, please get in touch with us to schedule a meeting. Tickets for Microsoft’s annual developer conference sold out in just an hour. It will take place in San Francisco from April 29th through May 1st. It’s really exciting to see where the company’s IoT vision will lead us. Microsoft CEO Satya Nadella said that the new product, “Azure IoT Suite”, “will combine business intelligence capabilities (Power BI) using real-time data (Azure Stream Analytics) with Azure Machine Learning capabilities”. He also announced that Azure IoT Suite will be available as a preview later this year. While few details about the suite were provided, it will be designed to address various IoT scenarios “such as remote monitoring, asset management and predictive maintenance.”
Read More