10/11/17

Online metadata and big data: differences and advantages when they complement each other

Online (or in-house) metadata is constantly ignored in big data projects, to the detriment of the projects themselves. It is ignored because it is often taken for granted that big data management already covers it. But this is not always the case, and online metadata is crucial both to the overall success of a big data project and to the organization of the company's data architecture.

Photo credit: Aleutie

It also happens that online metadata is sometimes confused with big data itself. Knowing the difference between the two is important in order to determine whether big data contains metadata, whether big data is really just a large collection of metadata, or whether it is something more.

 


 

There are a few key differences worth examining.

 

Points of divergence between big data and online metadata

As with the needle and the haystack, comparing online metadata with big data makes the difference apparent, not only in volume but also in granularity.

Big data is a collection of information of enormous dimensions that evolves at great speed, which makes it difficult to analyze and process and requires advanced technology to uncover trends and patterns. Metadata, by contrast, consists of the descriptive details of an individual digital asset. Thanks to this specificity, it provides granular information about a single file.

Bear in mind, however, that when a digital asset is created, online metadata about its origin, creation time, date and format, among other attributes, is generated as well. Simply having this metadata is not enough to stay organized in the digital era; this "data about data":

  • Must be properly named, tagged, stored and archived in a vocabulary consistent with the other assets in the collection. The main benefit of consistency in online metadata management is that it removes the need for a single person who knows where everything is: when the taxonomy is shared, work is simplified and resources can be used more efficiently.
  • Needs proper asset management, based on a methodology that makes assets easy to find and distribute, so their full value can be exploited.
  • Has to be understood, just as it is necessary to know the difference between structured and unstructured data.
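As a small illustration of the metadata generated alongside every digital asset, the sketch below reads a file's descriptive details (name, format, size, modification time) into one consistent record; the field names are my own, not a standard.

```python
import datetime
import os
import pathlib
import tempfile

def asset_metadata(path):
    """Describe a digital asset with a consistent, taxonomy-friendly record."""
    st = os.stat(path)
    return {
        "name": pathlib.Path(path).name,
        "format": pathlib.Path(path).suffix.lstrip(".") or "unknown",
        "size_bytes": st.st_size,
        "modified": datetime.datetime.fromtimestamp(st.st_mtime).isoformat(),
    }

# Demo on a throwaway asset
with tempfile.NamedTemporaryFile(suffix=".jpg", delete=False) as f:
    f.write(b"fake image bytes")
    demo_path = f.name

print(asset_metadata(demo_path))
os.remove(demo_path)
```

Because every record shares the same fields, assets from different sources can be archived and searched with one taxonomy, which is precisely the consistency benefit described above.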

The big data environment keeps expanding rapidly, so it is crucial to invest the time needed to ensure that the business's online metadata is well managed, organized and ready for consumption.

Only then can this descriptive, administrative and structural data that defines the company's information assets convey the value and purpose of the content it describes, becoming an effective tool for locating information quickly, something essential for big data analyses and for business users' reports.

 




via El valor de la gestión de datos http://ift.tt/2yN14S8

How machine learning drives productivity in data management


Machine learning is a technique whereby programs learn iteratively from data rather than remaining static. Machine learning systems are used to build a model from a continuous stream of input, which can then be used to make predictions or decisions.
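As a minimal sketch of that idea, the toy model below is not programmed with a fixed rule; it is updated iteratively from a stream of observations until it has learned the underlying relationship (the data, learning rate and target function are all illustrative).

```python
def update(w, b, x, y, lr=0.05):
    """One stochastic-gradient step on squared error for y ≈ w*x + b."""
    err = (w * x + b) - y
    return w - lr * err * x, b - lr * err

# Stream of noiseless observations following y = 2x + 1
stream = [(x / 10, 2 * (x / 10) + 1) for x in range(20)]

w, b = 0.0, 0.0
for x, y in stream * 200:     # the model keeps learning as data keeps arriving
    w, b = update(w, b, x, y)

print(w, b)                   # converges toward w ≈ 2, b ≈ 1
```

The key property is the one the paragraph describes: nothing about "2x + 1" is written into the program; the parameters drift toward it purely through repeated exposure to data.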



via El valor de la gestión de datos http://ift.tt/2hjYgko

Big Data’s Role in Marketing and Sales

What are the main areas of marketing and sales getting the most help from big data analytics? According to a Forbes report, customer analytics accounts for 48% of big data use cases and operational analytics for 21%. The same report puts the share of use cases at 10% for enterprise data warehouse optimization, 10% for new service and product development, and 12% for compliance and fraud. This breakdown shows that many companies are keen to use big data for customer value analysis, and it goes without saying that mobile apps are playing a major role in capitalizing on big data to drive marketing and sales.

The Relationship between Big Data and Mobility

The connection between big data and mobility is mutually reinforcing and reciprocal in many ways. To understand the depth of this relationship, one has to start with the basics. The rising importance of mobile apps powered the exponential growth of big data in both variety and volume, and this in turn fueled mobile app analytics as a way to learn about users and customers. The same customer data, obtained from analytics and device sensors, now feeds strategic decision-making with more practical insights. This is why mobile is so well suited to the whole big data trajectory.

Mobility's anywhere-anytime nature has helped organizations gain deeper insights into customer data, based on input, usage patterns and customer behavior. Besides enabling direct access to customer data, apps running in the background also capture a whole cluster of usage patterns and user data that adds further value for marketing analysis. This stream of varied data on customer actions and behavior gives mobile apps all the raw material for more effective insights to steer marketing activities. Even when customers sit idle without using any mobile app, the device keeps capturing valuable customer data.

This huge reserve of mobile customer data is then used to improve the mobile user experience, to drive and build mobile traffic, to increase customer interaction and engagement, and to push sales conversion. The arrival of advanced customer analytics such as Flurry's is a fine example of how app developers can actually use this data to steer customer engagement and retention. Step by step, mobile app marketing is becoming increasingly dependent on the analytics offered by customer insights and data, and in this respect the value of big data analytics keeps growing.

When it comes to mobile marketing and mobile advertising, this enormous amount of data-driven insight has a major part to play. Nearly every mobile app now carries location-based notifications and location sensors. Capturing the user's location data is a big and pivotal piece of mobile big data, enabling a contextual and well-targeted advertising approach: real-time, highly focused, hyper-local notifications and messages that take the "where" and "when" of the customer's situation into account. Specific customer data combined with location data can drive more personalization in marketing through mobile apps.

Increasing Competitive Advantage with Real-Time Analytics Through Big Data

Meaningful analytics is truly making things happen in real time these days. Mobile marketers are now alive to the value of real-time data. With high volumes of data giving analytics more power than ever, real-time analysis can offer businesses more advantages, and here the capabilities of advanced big data technology play a decisive part. Hadoop-based reporting makes it possible to examine data as it arrives, so businesses can make quick decisions and respond to changes in their direct marketing activities. Thanks to this real-time advantage, a business can now adjust in the middle of any campaign.
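As a toy illustration of the mid-campaign adjustment described above, the sketch below tracks the conversion rate over the most recent events in a stream and flags the moment it falls below a floor; the window size, floor and event data are all made up.

```python
from collections import deque

def rolling_alert(events, window=100, floor=0.02):
    """Flag event indices where the rolling conversion rate drops below `floor`."""
    recent = deque(maxlen=window)      # sliding window over the event stream
    alerts = []
    for i, converted in enumerate(events):
        recent.append(converted)
        if len(recent) == window and sum(recent) / window < floor:
            alerts.append(i)
    return alerts

# 100 events at a 5% conversion rate, then 100 events with no conversions
events = ([1] * 5 + [0] * 95) + [0] * 100
print(rolling_alert(events))  # indices where the campaign needs attention
```

The alert fires while the campaign is still running, which is the whole point of real-time analytics: the marketer can intervene before the budget is spent.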

Basic Components in a Productive Data-Driven Marketing Strategy

As the centrality of data-driven marketing pushes many organizations to adopt big data analytics and tools, there are some basic features they need to keep in mind. A data-driven approach to marketing must be cross-disciplinary first of all, so collaboration among different teams and team members should be encouraged. Second, the focus must be on the right KPIs, with more importance given to the insights between the lines than to the raw numbers. From time to time, review your buyer personas in the light of data-driven insights: the persona your business focuses on must be reassessed and, if necessary, more characteristic details appended to it.

Many organizations are already using a data-driven marketing approach to reach their customers: a full 44% of consumer marketing agencies and companies use big data to make their marketing more customer-focused and responsive. All these statistics help convey the enormous importance of big data in sales and marketing. For any business today, while mobility is center stage in a digital strategy, big data pushes the benefits of mobility further with more personalized, hyper-local and real-time marketing approaches.

The post Big Data’s Role in Marketing and Sales appeared first on Big Data Analytics News.



via Big Data Analytics News http://ift.tt/2xulYEY

How to Use Big Data to Your Competitive Advantage in Banking?


Small Data is not Enough Anymore

The hyper-connected world we live in generates over 2.5 billion gigabytes of data daily, coming from social media, messaging, video uploads, and sensor data from the IoT. In the US alone there are over 700 million credit cards, and the average user makes about 20 transactions a month. Add debit cards, reward cards, company transactions and overseas customers, and it becomes clear that traditional spreadsheet environments will have a hard time keeping up. Small data tools alone are simply no longer enough.

Bank-specific Uses of Big Data

Although big data can be useful in many verticals, the banking sector faces particular challenges related to risk management, compliance, and fraud, where big data can prove particularly valuable.

Real-time Risk Analysis

The banking sector is subject to the Basel III framework which aims to protect all banks from capital-related risks, credit risks, and financial risks to improve the banks’ abilities to counteract market shocks. Real-time analysis of transactions can help raise red flags when certain thresholds have been surpassed and warn banks to take appropriate measures. Big data is a suitable tool due to the high frequency of trading data and the sheer volume of it. Recent studies show an array of models that can be used to assess different types of risks, like systemic risk.
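As a minimal sketch of the threshold-based flagging mentioned above, the code below accumulates exposure per counterparty over a trade stream and raises a red flag when a limit is crossed. The limit, counterparty names and amounts are hypothetical; real Basel III risk metrics are far more involved.

```python
EXPOSURE_LIMIT = 1_000_000  # illustrative per-counterparty limit

def monitor(trades, limit=EXPOSURE_LIMIT):
    """Accumulate exposure per counterparty; report each trade that breaches the limit."""
    exposure = {}
    alerts = []
    for counterparty, amount in trades:
        exposure[counterparty] = exposure.get(counterparty, 0) + amount
        if exposure[counterparty] > limit:
            alerts.append((counterparty, exposure[counterparty]))
    return alerts

trades = [("acme", 600_000), ("zenith", 300_000), ("acme", 500_000)]
print(monitor(trades))  # "acme" crosses the limit on its second trade
```

Because the check runs per trade rather than in an end-of-day batch, the bank can take corrective measures while the position is still building up.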

Fraud Detection

The beauty of big data analysis is that it can detect patterns and classify them into normal and abnormal. This is the cornerstone of fraud detection. Since identity theft, mortgage application frauds and credit card frauds are on the rise, banks need a tool to catch an early alarm call and to ensure fast resolutions. Credibility, capital, and reputation are at stake every time a bank is the target of an attack, since customers are anxious about the safety of their financial data. Big data helps create a “normal behavior” profile for each customer including geo location, withdrawn amounts, and payments made. Every time there is a transaction that doesn’t fit the bill, the user is alerted.
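A minimal sketch of the "normal behavior" idea above: build a statistical profile from a customer's transaction history and flag anything that deviates strongly from it. The z-score threshold and the withdrawal amounts are illustrative; production systems profile many dimensions (geolocation, merchant, time of day), not just amounts.

```python
import statistics

def build_profile(history):
    """Summarize a customer's normal behavior as (mean, std dev)."""
    return statistics.mean(history), statistics.pstdev(history)

def is_suspicious(amount, profile, z_threshold=3.0):
    """Flag a transaction that deviates strongly from the profile."""
    mean, std = profile
    if std == 0:
        return amount != mean
    return abs(amount - mean) / std > z_threshold

history = [40, 55, 35, 60, 50, 45, 52, 48]   # typical withdrawals
profile = build_profile(history)

print(is_suspicious(50, profile))    # fits the profile
print(is_suspicious(900, profile))   # doesn't fit the bill: alert the user
```

The classification into "normal" and "abnormal" happens per customer, which is why a 900-unit withdrawal is suspicious here but would be routine for a customer who regularly moves that much.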

Compliance & Regulatory

The banking sector is subject to essential compliance regulations. The institutions struggle to remain within the guidelines while maximizing their profits. Big Data consultants from Itransition state that these instruments can offer a competitive advantage in analyzing investment portfolios and creating hedged versions while complying with the Dodd-Frank Act or similar local regulations.

General Applications Useful to Banks

Apart from the sector-specific applications, banks can employ big data in a similar way to retail chains, to increase the personalization of their offer and build a better relationship with the customers.

360-degree Customer View

A bank has access to a full user profile for each of their customers. The transaction history, the debt ratio, spending habits, all tell a story and can help the bank to define personas, understand preferences and lower churn rates. This is the foundation for personalization that drives loyalty. Banks should go beyond just adding the name to a greeting and use recommendation engines to present the customer only with the most relevant content. To save some advertising money, you can also build probabilistic models of interest in a specific product based on the customer’s lifestyle (traveler, parent, student).

Customer Segmentation

Before big data, segmentation was based on demographics, geographic location and, to put it bluntly, stereotypes. Right now, with advanced clustering algorithms (k-means, fuzzy clustering), an institution such as a bank or a credit company can create segments based on behavior, for example, credit card usage. Also, identifying the most valuable customers can be the base for personalized discounts and loyalty schemes.
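As a toy sketch of behavior-based clustering, the code below runs a tiny one-dimensional k-means over monthly card-usage counts. The data and k are illustrative; a real pipeline would cluster over many behavioral features with a library such as scikit-learn.

```python
def kmeans_1d(values, k, iters=20):
    """Tiny 1-D k-means (assumes k >= 2 and len(values) >= k)."""
    vs = sorted(values)
    # spread the initial centers evenly across the value range
    centers = [vs[i * (len(vs) - 1) // (k - 1)] for i in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for v in vs:
            nearest = min(range(k), key=lambda i: abs(v - centers[i]))
            clusters[nearest].append(v)
        # move each center to the mean of its cluster
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return sorted(centers)

usage = [2, 3, 4, 5, 18, 20, 22, 45, 50, 55]  # card transactions per month
print(kmeans_1d(usage, k=3))  # → [3.5, 20.0, 50.0]: light, medium, heavy users
```

The three centers fall out of the behavior itself rather than from demographics, and the heaviest-usage segment is a natural starting point for the personalized discounts and loyalty schemes mentioned above.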

Multichannel

Each customer has a preferred interaction channel, and sometimes they are not even aware of this fact. By studying behaviors retrieved from big data sources such as email logs, social media communication, and website traffic, banks can enrich each user’s profile with the preferred channel and use that one to contact them. The same user can switch between channels and only use some of them to convert, and all the other ones just to make an informed decision.
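A minimal sketch of enriching a profile with a preferred channel: count interaction events per user across channels and keep the most frequent one. The event records are hypothetical; in practice they would come from email logs, social media and web analytics as described above.

```python
from collections import Counter

events = [
    ("u1", "email"), ("u1", "email"), ("u1", "web"),
    ("u2", "social"), ("u2", "social"), ("u2", "email"),
]

def preferred_channels(events):
    """Map each user to the channel they interact with most often."""
    per_user = {}
    for user, channel in events:
        per_user.setdefault(user, Counter())[channel] += 1
    return {user: counts.most_common(1)[0][0]
            for user, counts in per_user.items()}

print(preferred_channels(events))  # {'u1': 'email', 'u2': 'social'}
```

A real system would weight channels by recency and by whether the interaction led to a conversion, since, as noted, customers often browse on one channel and convert on another.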

Chatbots

Customer care service is an integral part of the bank’s relationship with their customers. Offering multilingual, 24/7 service is expensive and unfeasible. Yet, chatbots, powered by big data retrieved from existing call-center logs, can help create conversational interfaces that respond to frequent and straightforward inquiries.

Reducing Costs

As in any business, innovation is even more intriguing if it brings the opportunity to reduce costs. Big data has been democratized, and there are no more financial barriers in using it. It has already crossed the chasm towards helping banks save money by analyzing performance, optimizing discounts and price offers, saving on customer service employees and even closing down unproductive departments or branches.

Examples

Churn rates eat into banks' profits and should be lowered. American Express has built a model with 115 variables that correctly pinpoints about a quarter of the accounts that will close in the following quarter.
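As a toy illustration of what such a churn model does (with two made-up features and made-up weights, in no way American Express's actual 115-variable model), a linear score pushed through a logistic function turns account behavior into a closure probability:

```python
import math

# Hypothetical weights; a real model would learn these from historical data.
WEIGHTS = {"months_inactive": 0.8, "support_calls": 0.5}
BIAS = -3.0

def churn_probability(account):
    """Logistic score: probability the account closes next quarter."""
    z = BIAS + sum(WEIGHTS[f] * account[f] for f in WEIGHTS)
    return 1 / (1 + math.exp(-z))

loyal = {"months_inactive": 0, "support_calls": 1}
at_risk = {"months_inactive": 5, "support_calls": 4}

print(round(churn_probability(loyal), 2))    # low risk
print(round(churn_probability(at_risk), 2))  # high risk: worth a retention offer
```

Ranking accounts by this probability is what lets a bank target retention offers at the quarter's likely closures instead of the whole customer base.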

Prejudice and bias can cost a company money and image. When the Bank of America took a closer look at what their customers used the loans for, they changed their marketing message from one directed to parents of college kids to a broader one which sold ten times more.

Nina, Aida, and Erica are chatbots already having thousands of conversations with bank customers, answering any queries about accounts, providing financial advice and saving the institutions they belong to thousands of dollars.

On a Journey to Simplification

Paradoxically, given its name, big data's role in a bank is to simplify tasks. It should make processes as automated as possible and offer actionable insights in just a few clicks, provided a well-designed dashboard is already in place. This is only possible with clean and organized data, even if it is unstructured. To take full advantage of these new technologies, banks should rethink their data storage structures, dismantle silos and cooperate with branches and customers to get everything centralized.

The post How to Use Big Data to Your Competitive Advantage in Banking? appeared first on Big Data Analytics News.



via Big Data Analytics News http://ift.tt/2ixfwpU

7 Ways Big Data will increase Conversion Rate on your Site

Big data is a vast collection of data gathered from various traditional and digital sources. It is a gold mine for marketing analysts and a source of constant discovery and interpretation of consumer behavior. Nowadays, analysts focus on social media websites and customers' online activities to collect big data.

But big data only proves valuable once you realize the potential of "big-data-driven marketing". This form of marketing can work wonders for your business: handled effectively, big data can show quick results and improve overall customer engagement. E-commerce is a very dynamic way of selling online, and you can exceed your expected profits if you learn to optimize your conversion rate. This is where big data insights and analysis come to the rescue and help increase the conversion rate on your website. Here are some ways big data can help you reach your website conversion goals:

1. Helps create a Data-Driven Marketing Strategy

One of the basic ways big data can strengthen your marketing strategy is by turning it into a data-driven one. According to research by Teradata, many enterprises are now focusing on big data to create more targeted marketing strategies that generate increased revenue, better margins and, ultimately, higher profits.

Big data insights are becoming increasingly valuable for businesses because data-driven marketing strategies are much more effective at reaching potential customers, thereby increasing the website's conversion rate.

2. Allows you to understand your customer

E-commerce consumers are increasingly well informed, and unless you treat them like royalty it is difficult to optimize your conversion rates. According to one study, using big data analysis to its full potential can increase sales by 60%. This is why it is important to use big data to understand your customers, so you can have a customized one-on-one conversation with them. It not only reassures customers that they are being taken seriously but also helps you tailor your sales pitch to the buyer's interests, suggesting offers and services that match the consumer's needs. Big data is thus an accurate way to understand customers, attract and nurture leads, and guide buyers along their e-shopping journey.

3. Helps evaluate your product

Big data can tell you how well your product is perceived in the real world. Whether the reviews on social media are good or bad, you can access them and learn a lot about your product. Customer insights can help you redevelop your products so that they evolve constantly. If you are selling a top-notch product, people will naturally be drawn to your website, resulting in a higher conversion rate.

4. Keeps your data up-to-date

Big data tools such as CRM programs can keep your data safe and organized at all times. You can clearly identify internal security threats and fix them immediately. A sales pipeline template also shows potential leads and gives you a clear sales forecast, enabling you to design future marketing strategies accordingly.

5. Enables better pricing decisions

One of the most important factors affecting your website's conversion rate is competitive product pricing. If your prices are better than those of other e-commerce websites, consumers will buy from you. But checking each competitor's website for prices is exhausting, and this is where big data can be of great help.

Most online shoppers browse many similar online stores and use price-comparison engines to decide which item to buy. By properly analyzing this big data, you can automatically get a picture of market prices and the season's most in-demand products.

6. Suggests relevant product recommendations

As discussed above, it is extremely important for customers to have a personalized shopping experience on your website. If a customer gets relevant product recommendations based on their interests, the lead is far more likely to convert into a sale.

Big data uses consumers' online activity to predict the user's next action. By understanding the buyer's profile, it offers relevant suggestions to the customer, which ensures a higher conversion rate for the website.

7. Increases customer responsiveness

Research by the Aberdeen Group suggests that responsive websites achieve a visitor-to-buyer conversion rate up to 11% higher than non-responsive ones. Big data can help you maintain a highly responsive website by using non-relational databases that accurately predict future customer activity.

This is why big data plays a vital role in increasing customer responsiveness and gaining better customer insights. According to a Forrester study, 44% of marketing analysts use big data to improve responsiveness, and more than 36% of them use data mining to gain customer insights that help them plan more relationship-driven strategies.

Conclusion:

Beyond the benefits above, big data has also revolutionized the way SEO affects your business. It has made it very easy for search engines to filter content and deliver relevant results to the user; businesses that incorporate SEO correctly into their online content will therefore generate a lot of website traffic.

Big data is a gold mine of information that can change the fate of your online business if used intelligently. Ironically, though, less than 0.5% of all data is ever analyzed and turned into action. Once online businesses realize its true potential, big data will become an irreplaceable part of their strategies.

The post is by Audrey Throne, a mother of a 2-year old and a professional blogger by choice. Throne is passionate about health, technology and management and blogs frequently on these topics. Find her on Twitter: @audrey_throne.

The post 7 Ways Big Data will increase Conversion Rate on your Site appeared first on Big Data Analytics News.



via Big Data Analytics News http://ift.tt/2lSLhLm

17/8/17

The Important Attributes of Big Data to the Travel Industry

In this contributed article, Ritesh Mehta, Senior Technical Account Manager for TatvaSoft Australia, discusses the impact big data is having on the global travel industry and offers 5 action points for industry players to adopt in order to capitalize on big data.

via insideBIGDATA http://ift.tt/2x4vERZ

6 reasons to adopt leading data integration and data quality solutions

Good data quality management is the main success factor in a data integration process. It can be considered the first step of the integration process and the key to making data profitable.

Leaders in data integration and data quality

Business intelligence tools rely heavily on dashboards and analytical tools that need to integrate data from multiple sources. But before this integration we always need to manage the quality of that data, to make sure the BI tool's output is reliable and gives us an edge over our competitors.

 


 

Given the importance of coordinated, effective interplay between data integration and data quality tools, it seems clear that using leading solutions can be fundamental to guaranteeing the success of these projects.

Let's look at some reasons to adopt this kind of leading solution.

 

3 reasons to use a leading data integration tool

The leaders in the data integration tool market are companies with a very clear grasp of the strong relationship between data and application integration, able to handle location-independent deployments with ease: deployments not limited to the cloud or to on-premises installations, but extendable beyond any specific location. They also adapt quickly to recognize and address new and emerging market demands, often delivering new functional capabilities in their products before demand exists and identifying new types of business problems to which data integration tools can bring significant value. They have an established market presence, significant size and a substantial multinational footprint.

The three main reasons to use leading data integration tools can be grouped as follows:


  1. They adapt quickly to the market's functionality demands. A data integration leader's product development and roadmap span diverse capabilities, including batch ETL, real-time integration, data sharing and data virtualization. Strong interoperability and synergies between the data integration tools and the vendor's other technologies encourage their use as an enterprise standard. The emphasis on supporting integration of both digital and IoT data, iPaaS, self-service data preparation, big data, data governance and data security capitalizes on demand trends.
  2. A strong commitment to data management and non-technical roles. Leading tools offer broad functionality for not-necessarily-technical users to do self-service data preparation, available on-premises as well as in cloud or big data deployments, typically through data-wizard applications. They emphasize collaborative functionality aimed at non-technical business users and the agility of the data integration infrastructure as an enterprise standard.
  3. Broad market presence and a dedicated focus on innovation. Leading tools have a wide global network of partners, resellers, large system integrators and external service providers offering extensive implementation support. This ensures deployment in virtually any location.

3 reasons to use a leading data quality tool

Leaders in data quality tools provide a full range of data quality functions, including profiling, parsing, standardization, matching, validation and enrichment. They are companies with a clear understanding of and strategy for the data quality market; they apply innovative, differentiating ideas and bring innovative products to market. Their tools address all verticals, geographies, data domains and use cases. The capabilities offered include recognition of multi-domain data quality issues, deployment options such as SaaS, self-service support for roles such as information stewards, data preparation functionality for business users, use of machine learning and algorithms, data quality support for IoT, and support for a trust-based governance model. They have an established market presence, significant size and a strong multinational footprint, either directly or through a parent company.

The three main reasons to use leading data quality tools can be grouped as follows:

  1. Innovation and product strategy. The product strategy of data quality tool leaders is built on intelligent data platforms. They use machine learning, algorithms and predictive analytics to address emerging scenarios such as IoT, big data analytics, data governance and content-based data analysis.
  2. Business-oriented enterprise technology. The data quality capabilities of leading tools meet the needs of the main business roles, such as information stewards and data analysts, while providing the enterprise depth and scalability that technical roles require.
  3. Market understanding and a strong marketing and sales strategy. Leading tools grow strongly, backed by a deep understanding of the data quality market and an ability to anticipate and adapt to market changes. These companies' market understanding is tightly correlated with their sales and marketing strategy and with closed-loop market execution.
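Two of the core data quality functions named above, standardization and matching, can be sketched in a few lines. The rules and records below are illustrative only; leading tools also cover profiling, parsing, validation and enrichment, with far more sophisticated matching logic.

```python
def standardize(record):
    """Normalize a customer record: collapse whitespace, fix casing."""
    return {
        "name": " ".join(record["name"].split()).title(),
        "email": record["email"].strip().lower(),
    }

def is_match(a, b):
    """Very naive duplicate detection: same e-mail after standardization."""
    return standardize(a)["email"] == standardize(b)["email"]

a = {"name": "  ana  LOPEZ ", "email": "Ana.Lopez@Example.com "}
b = {"name": "Ana Lopez", "email": "ana.lopez@example.com"}
print(is_match(a, b))  # the two records describe the same customer
```

Standardizing before matching is the key ordering here: without it, the two records above would look like different customers, and the BI layer downstream would double-count them.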

 




via El valor de la gestión de datos http://ift.tt/2vGj0un

14/8/17

The success of a data-driven company and the danger of unaligned IT

When an organization is able to execute a data strategy correctly, the main benefits it can expect are business agility and responsiveness, together with a clear competitive advantage.


Conversely, companies that give their data less strategic importance face lower revenue growth and weaker competitiveness.

 


 

The most important point in all this, however, is understanding that a lack of alignment between IT and the rest of the company's staff on data strategy can make those strategies fail, minimizing the expected growth.

 

The advantages of using data intelligently

Using data intelligently helps companies increase their revenue. That is the conclusion of the report "The data directive". According to it, 81% of high-growth companies make use of their data, compared with only 57% of low-performing ones.

Moreover, these high-performing companies tend to share almost all the elements of their decision-making processes, which makes it very likely that they act on the results their data produces.

Likewise:

  • Most of these companies give their executives data on which to base decisions: 92% do so, compared with only 35% of the ineffective ones.
  • They are 12 times more likely to consider their strategic planning and decisions to be data-driven.
  • It is the CEO, rather than the CIO, who leads data-related initiatives.

The problem of unaligned IT

Misalignment with data initiatives in the IT department comes more from rank-and-file IT staff than from department heads. The latter do see how important it is to collaborate with the rest of the business users, whereas rank-and-file workers are considerably less aware of the influence of data on the business.

In fact, IT workers are far less inclined to collaborate: only 17.2% of non-executive employees regularly consult leadership about data management strategy.

The problem is that these are the very people who administer the data, and this undermines companies' ability to use data effectively. Additional studies indicate that, even though data use is a strategic priority for CIOs, close to 50% of them say they are tired of chasing their staff to focus more on customers and the business.

That said, there is alignment on the general idea that an effective data strategy can provide an advantage over the competition. They also agree that data can help secure more resources for internal users, and that the techniques currently used to manage data are not adequate to meet their companies' needs.

We can therefore conclude that it seems advisable to train employees about the impact their work can have on their companies. These findings also suggest that data-driven organizations converge on design principles such as tying the data strategy to concrete business KPIs.

 




via El valor de la gestión de datos http://ift.tt/2uHnaQ5

The importance of security and integrity in databases

When we talk about database integrity we are referring to the completeness, accuracy, and consistency of the data set held in a database. We can perceive this integrity when we see that, between two instances or two updates of a data record, there has been no alteration, meaning the data is intact and unchanged.


integridad_en_base_de_datos.jpg

Photo credits: JohnDWilliams

We lay the foundations of database integrity during the design phase of the database, through the use of standard rules and procedures. From there, we can continue to maintain it by means of error-checking methods and validation procedures.
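As a hedged sketch of design-time rules (using SQLite and an invented two-table schema), integrity can be declared once in the schema so the engine enforces it on every write:

```python
import sqlite3

# Hypothetical schema: NOT NULL, UNIQUE, CHECK and FOREIGN KEY
# constraints are validated by the engine on every insert and update.
conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite needs this opt-in
conn.execute("""
    CREATE TABLE customers (
        id    INTEGER PRIMARY KEY,
        email TEXT NOT NULL UNIQUE
    )""")
conn.execute("""
    CREATE TABLE orders (
        id          INTEGER PRIMARY KEY,
        customer_id INTEGER NOT NULL REFERENCES customers(id),
        amount      REAL CHECK (amount > 0)
    )""")

conn.execute("INSERT INTO customers (id, email) VALUES (1, 'ana@example.com')")
conn.execute("INSERT INTO orders (customer_id, amount) VALUES (1, 99.5)")

# A row that violates referential integrity is rejected outright.
try:
    conn.execute("INSERT INTO orders (customer_id, amount) VALUES (42, 10)")
except sqlite3.IntegrityError as e:
    print("rejected:", e)
```

Declaring the rules in the schema, rather than scattering checks across application code, is what keeps two updates of the same record from drifting apart.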

 


 

The concept of database integrity guarantees that all the data in a database can be traced using traceability techniques, and connected to other data. This ensures that everything can be searched and retrieved.

Having a single, well-defined, well-controlled database integrity system increases stability, performance, and reusability, and makes maintenance easier.

 

What information security is and what it has to do with database integrity

Information security is concerned with protecting the confidentiality, availability, and integrity of all the organization's knowledge assets. Achieving this involves:

  • Confidentiality: this is the most important aspect of database security. It is achieved through encryption, which must be applied to data at rest, but also to data that, for one reason or another, is in transit.
  • Database integrity: seeks to guarantee that only authorized people can act on privileged company information. It is enforced through authentication protocols, internal policies (such as those governing password strength), and a user access control system that defines the permissions determining who can access which data. Measures should also be taken to ensure that unused accounts are locked or deleted.
  • Availability: refers to the need for databases and all the information they contain to be ready for use. On the one hand, their functionality and reliability must be guaranteed; on the other, it is advisable to schedule downtime outside working hours.
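As an illustrative sketch of the authentication protocols mentioned above (the function names are invented), passwords should never be stored in the clear, only as salted hashes, and comparisons should be constant-time:

```python
import hashlib
import hmac
import os

def hash_password(password, salt=None):
    """Derive a salted hash suitable for storage (PBKDF2-HMAC-SHA256)."""
    salt = salt or os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify_password(password, salt, stored):
    """Constant-time comparison to avoid timing side channels."""
    _, candidate = hash_password(password, salt)
    return hmac.compare_digest(candidate, stored)

salt, stored = hash_password("s3cret!")
print(verify_password("s3cret!", salt, stored))  # True
print(verify_password("guess", salt, stored))    # False
```

The high iteration count deliberately slows down brute-force guessing, which is exactly the weak-password threat described later in this article.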

Guaranteeing database integrity, along with availability and reliability, is decisive for the smooth running of the business. However, the threat never lets up and, today, attacks are multiplying in both frequency and scope. Hackers no longer covet only the information assets of large multinational corporations; they target companies of every kind, regardless of size, purpose, or industry.

 

Types of attacks on database integrity

Clearly, the risk implicit in this kind of malicious action varies from one organization to another, but the most common attacks target:

  • Customers' personal data, credit card and social security numbers.
  • Strategic business details.
  • Financial information about the company itself and its partners.
  • Sensitive data about employees.

It could be said that this covers most of the active databases in a company's directories, or at least all those that are in some way relevant to the business. Precisely for that reason, it is necessary to maintain solid security practices and defense strategies capable of combating these attacks, including their most recent and sophisticated variants, such as phishing, spear phishing, SQL injection, DDoS, advanced persistent threats, and ransomware.

According to Ponemon's SQL Injection Threat Survey, “65% of the organizations surveyed had experienced a successful attack of this kind in the past year.” Among the causes that may have led to the breach of database integrity is the absence of database scanning, or irregular scanning, a common failing in 47% of cases.

This is a surprising figure, especially considering that 49% of respondents rated the threat level of SQL injection in their organization at 9 or 10.
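As a minimal, hedged sketch (SQLite and an invented `users` table), the standard defense against SQL injection is to pass values as bound parameters instead of concatenating them into the query string:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin'), ('bob', 'user')")

malicious = "alice' OR '1'='1"  # classic injection payload

# Vulnerable: string concatenation lets the payload rewrite the query.
vulnerable = conn.execute(
    "SELECT name FROM users WHERE name = '%s'" % malicious).fetchall()

# Safe: the driver binds the value, so the payload is just a literal string.
safe = conn.execute(
    "SELECT name FROM users WHERE name = ?", (malicious,)).fetchall()

print(vulnerable)  # every row leaks: [('alice',), ('bob',)]
print(safe)        # no user has that literal name: []
```

The same placeholder discipline applies in any driver; only the placeholder syntax changes.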

However, there is no need to look that far: weak authentication is the most common threat to database security and integrity. A single password used for different purposes, shared between users, never updated, or simply obvious makes it easy for a malicious attacker to steal the identity of a legitimate user. Once those 8, 10, or 12 characters are known, the attacker has access to confidential data and has the organization in their hands.

 

Security best practices that help guarantee database integrity

One of the most effective ways of guaranteeing database integrity is to implement security best practices, including the following:

  • Use data masking: allowing users to access certain information without actually seeing it helps maintain confidentiality even in test environments.
  • Minimize extras and stick to the services, applications, and features that are genuinely necessary for normal business operations; this reduces risk.
  • Make sure database administrators understand the importance of protecting the database.
  • Keep databases up to date and remove unknown components.
  • Use tools such as static code analysis, which help reduce SQL injection, buffer overflow, and configuration problems.
  • Make frequent backups and use an uninterruptible power supply (UPS) so that a power outage does not cause data loss.
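As an illustrative sketch of the first practice above (the masking rules are invented, not a standard), sensitive fields can be masked before data reaches a test environment:

```python
def mask_card(number):
    """Keep only the last four digits of a card number."""
    digits = [c for c in number if c.isdigit()]
    return "*" * (len(digits) - 4) + "".join(digits[-4:])

def mask_email(address):
    """Hide the local part of an e-mail address except its first letter."""
    local, _, domain = address.partition("@")
    return local[0] + "*" * (len(local) - 1) + "@" + domain

print(mask_card("4111 1111 1111 1234"))      # ************1234
print(mask_email("ana.garcia@example.com"))  # a*********@example.com
```

Masked values keep the shape developers need for testing while the real values never leave the production database.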

 




via El valor de la gestión de datos http://ift.tt/2w6VWFW

5 Aspects of Modern Society Now Using Data Analytics To Their Benefit

In a world so heavily dependent on technology, it is no surprise that new predictive tech and the ability to finally visualize big data in a fully interactive way are at the forefront of many of our industries today, creating what is fondly referred to as “the analytics bandwagon of 2017”. However, not every one of these industries now utilizing big data analytics to their benefit originally seemed to need data visualization to succeed.

In fact, many of these industries may be surprising at first, but the outcome of this use of data analytics has taken them from their original spot in the past and catapulted them into the future of technology in business. With this said, five particular industries have truly grown the most since incorporating data analytics appropriately, and, as data scientists, it is highly important that we recognize this in order to be ahead of the data trends that matter the most. After all, trends are the foundation on which predictive data analytics are based and with which this future is built.

1. Small Business

For years, small business has primarily focused on small amounts of data analysis to succeed and simply learned how to recover after small business failures rather than how to succeed despite the odds. However, with tech giants such as IBM, SAS, and Microsoft offering affordable, cloud-based data-crunching services, it has become easier than ever for small businesses to cross-reference their data with the ever-expanding big data collections gathered through the analysis of social networks, government databases, and usage patterns on mobile devices, among other things.

With this said, the ability to tap into the big data of businesses worldwide in an affordable manner has allowed these small companies not only to review their own data but to compare it against their competitors' in order to predict current trends, find flaws in their current business model, and improve accordingly. This enables them to make more money, grow, and be as successful as, if not more successful than, other businesses in their given industry. By doing this, they are also able to find new trends in customer interaction and utilize new forms of technology, such as JavaScript webcam and video ingest APIs, to interact with their client base more effectively. This also reduces the cost of human-to-human interaction, which shrinks call centers and increases profitability, while still letting customers feel they are being interacted with in a personal manner.

Although these are all wonderful ways to utilize this data, small companies have also been able to use other sources, such as recorded calls and other forms of consumer behavior analytics, to cross-reference their internal-pricing histories, customer traffic patterns, and purchasing trends. By doing this, companies are able to predict customer needs before they arise, create targeted marketing campaigns, price items accordingly, and remove the human bias from small business in order to become more efficient and less opinion-based.

Furthermore, there are multiple different data analytics resources for small businesses to utilize nowadays that make it even more efficient than ever before. With this said, many small businesses turn to the internet and various sources of big data analytics in order to find their flaws, find the trends they seek, and analyze the aspects of their customer relations necessary to make themselves as efficient and successful as the companies they compare themselves to online. From there, they can finally take the guessing out of running a small business and know exactly what their customer wants and what their competitors offer with very little money invested whatsoever in the process.

2. Education Sector

With the dependency on technology consistently growing, more and more individuals have begun to recognize the advantages of online learning, which has led to a tremendous boost in online students worldwide. However, it can often be difficult to analyze the data received from these online classes, as the students are located all around the world and have different study patterns and success rates. Yet it is important that online courses improve as consistently as their onsite counterparts do; otherwise a disparity develops in the education world between online classes and their success rates over time.

Therefore, it is important that these courses, along with their teachers and respective schools, consistently review any and all data they receive in order to find ways to improve and help students of all learning and success levels. For this, online teachers have begun to analyze time-based analytics, discussion board trends, and individual assignment success rates in order to analyze not only how students work but also when they work, as well as how they feel and interact with other online students.

By doing this, teachers have been able to recognize when to send out assignments based on the days that students are the most successful and engaged, tailor assignments according to what these individuals are successful in or lacking in, and also discuss various subjects with classes based on their opinions of certain assignments likewise in order to help these online students succeed. It also helps teachers disseminate all the information they actually need and learn the information that they need to be successful in the field they hope to soon be a part of. In the end, this allows for educators to improve online courses through data analytics and compile the data for future classes in order to analyze how successful not only the school is but certain generations as well.

3. Insurance Industry

With an industry that is heavily invested in data analytics such as the insurance industry, it comes as no surprise that insurance and technology tend to grow side by side in most cases. In fact, although a large portion of the insurance industry was unaware of just how valuable their analytics were, many insurance companies have begun to provide this information to their counterparts in order to not only help them recognize how implementing analytics for tangible results can increase their business but also to create an industry that is connected and ever-expanding likewise.

Furthermore, by using data analysis to their benefit, these companies are also able to tap into big data analytics and compare their analytics accordingly. Through this process, entrepreneurs can streamline costs, be more targeted with the risks they want to take in business and the demographics and consumers they reach, identify new customers, and predict fraud, which has been a major issue in the insurance industry of late. Therefore, the insurance industry could not only learn how to be more productive and transition its efforts from agents and distribution channels to end customers but also learn how to prevent some of the major data security concerns in the industry through fraud prevention.

This ultimately means that data itself could actually be able to predict its insecurity and oust hackers before they ever have the chance to hack the companies they intend to. On top of this, by gathering data from their companies regarding telematics devices, smart phones, social media, CCTV footage, electoral rolls, credit reports, website analytics, government statistics, and satellite data, they will soon be able to also recognize consumer trends. Not only will this make insurance easier to receive but also make it cheaper for consumers as well, which will help the company to make their guest services more automated and reduce costs of person-to-person interaction. With all of this in mind, it is clear that data analytics have a definitive position in the insurance industry and will continue to improve insurance and its efficiency each and every day as well.

4. Travel Industry

The travel industry is one that is heavily focused on convenience, affordability, and comfort. Because of this, it is highly important that anyone in the travel industry pays close attention to their convenience analytics, as well as their budget analytics of their business, to provide a more affordable option than their competitors and a more convenient one as well.

In fact, by analyzing trends in their competitors' reviews, budgets, consumer interactions, and site UX, these companies can provide a more convenient and affordable option that will ultimately attract more people. Furthermore, they can also compare their competitors' demographics against what entices those demographics in universal big data, so as to know how their competitors are marketing and how they should market themselves to be more enticing to the relevant demographic.

On top of this, these companies can also create new marketing strategies using predictive analytics trends in order to attract new potential markets likewise. For instance, many companies in the travel industry tend to market themselves to young adventurous demographics. Despite this, a large part of the market is lost as retirees also love to travel to foreign destinations as well. Therefore, by recognizing this trend, these companies can target more than one demographic in an efficient way and not only increase their reach but also their profitability over time.

Furthermore, many travelers have begun to recognize the benefits of medical tourism as well. Medical tourism is when a patient travels outside of the country or state in order to receive medical care. With so many stipulations on medical care now becoming major issues in America, this option has become more and more enticing to individuals dealing with medical issues that are often uninsured or extremely expensive in the states.

Therefore, by analyzing the medical tourism analytics involved in medical travel, travel companies can also find ways to market towards these individuals specifically — in order to open up their company to an entirely new and unique market and their trends before anyone else. Through this, these travel companies can ultimately create a far more profitable and inclusive market which will not only help the travel economy but also promote global travel for all as well.

5. Healthcare Industry

Although medical tourism is definitely one aspect of the healthcare industry being significantly affected by analytics, there are many more. Acquiring the right forms of data can allow the healthcare industry to provide more informed care, predict future outbreaks of disease, and take preventive care to an entirely new level as well.

On top of this, analytics will ultimately allow healthcare professionals to predict and prevent fraud, which will further secure online patient records. With the recent WannaCry ransomware attack hitting the healthcare industry heavily, this could help hospitals recognize risks and remove them before they significantly affect patients.

Furthermore, this could also significantly help rural healthcare initiatives as it would allow for healthcare professionals to monitor patients in real-time in order to help them and also predict conditions even when they are not anywhere near a hospital. Not only would this lower the amount of diseases and deaths in rural America but it would also allow for healthcare professionals to provide affordable healthcare to individuals unable to pay for current insurance prices likewise.

Therefore, by using predictive data analytics in the healthcare world, the industry could ultimately lower the amount of rural and uninsured deaths yearly, as well as provide more informed preventive care as well and remove the risks of data breaches likewise, which would ultimately lead to a more efficient, affordable, and secure healthcare system for all.

In the end, there are many ways that professionals in various industries are currently utilizing big data analytics to their benefit, but the opportunities still remain endless, and, perhaps in time, these analytics could not only help to make our world a more efficient and affordable place but also help us to predict and formulate a brighter future for our children and the world of tomorrow as well.

The post 5 Aspects of Modern Society Now Using Data Analytics To Their Benefit appeared first on Big Data Analytics News.



via Big Data Analytics News http://ift.tt/2tVbIj3

27/7/17

Using Python to Drive New Insights and Innovation from Big Data


Data science and machine learning have emerged as the keys to unlocking value in enterprise data assets. Unlike traditional business analytics, which focus on known values and past performance, data science aims to identify hidden patterns in order to drive new innovations. Behind these efforts are the programming languages used by data science teams to clean up and prepare data, write and test algorithms, build statistical models, and translate results into consumable applications or visualizations. In this regard, Python stands out as the language best suited for all areas of the data science and machine learning framework.

In a recent white paper “Management’s Guide – Unlocking the Power of Data Science & Machine Learning with Python,” ActiveState – the Open Source Language Company – provides a summary of Python’s attributes in a number of important areas, as well as considerations for implementing Python to drive new insights and innovation from big data.

When it comes to which language is best for data science, the short answer is that it depends on the work you are trying to do. Python and R are suited for data science functions, while Java is the standard choice for integrating data science code into large-scale systems like Hadoop. However, Python challenges Java in that respect, and offers additional value as a tool for building web applications. Recently, Go has emerged as an up-and-coming alternative to the three major languages, but it is not yet as well supported as Python.

In practice, data science teams use a combination of languages to play to the strengths of each one, with Python and R used in varying degrees. The guide includes a brief comparison table highlighting each language in the context of data science.
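As a hedged, minimal illustration of the clean-up-and-summarize work described above (the records are invented and only the standard library is used, though real teams would typically reach for pandas or NumPy):

```python
import statistics

# Raw records as they might arrive from a log export: some fields missing.
raw = [
    {"user": "a", "latency_ms": "120"},
    {"user": "b", "latency_ms": ""},   # missing value to clean out
    {"user": "c", "latency_ms": "95"},
    {"user": "d", "latency_ms": "210"},
]

# Clean: drop rows without a measurement, convert strings to numbers.
latencies = [float(r["latency_ms"]) for r in raw if r["latency_ms"]]

# Summarize: basic descriptive statistics over the cleaned column.
print("n =", len(latencies))                             # n = 3
print("mean =", round(statistics.mean(latencies), 1))    # mean = 141.7
print("stdev =", round(statistics.stdev(latencies), 1))  # stdev = 60.5
```

The same prepare-then-model flow scales up: the cleaned column would feed a statistical model or a scikit-learn estimator instead of a print statement.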

Companies are not only maximizing their use of data, but transforming into ‘algorithmic businesses’ with Python as the leading language for machine learning. Whether it’s automatic stock trading, discoveries of new drug treatments, optimized resource production or any number of applications involving speech, text or image recognition, machine and deep learning are becoming the primary competitive advantage in every industry.


In the complete white paper, ActiveState covers:

  • Introduction: the Big Data Dilemma
  • Python vs. Other Languages
  • Data Analysis with Python
  • Machine Learning with Python
  • Recommendations

To learn more about introducing Python into your data science technology stack, download the full white paper.



via insideBIGDATA http://ift.tt/2ueDR7y

Scality Launches Zenko, Open Source Software To Assure Data Control In A Multi-Cloud World

Scality, a leader in object and cloud storage, announced the open source launch of its Scality Zenko, a Multi-Cloud Data Controller. The new solution is free to use and embed into developer applications, opening a new world of multi-cloud storage for developers.

Zenko provides a unified interface based on a proven implementation of the Amazon S3 API across clouds. This allows any cloud to be addressed with the same API and access layer, while storing information in their respective native format. For example, any Amazon S3-compliant application can now support Azure Blob Storage without any application modification. Scality’s vision for Zenko is to add data management controls to protect vital business assets, and metadata search to quickly subset large datasets based on simple business descriptors.

“We believe that everyone should be in control of their data,” said Giorgio Regni, CTO at Scality. “Our vision for Zenko is simple—bring control and freedom to the developer to unleash a new generation of multi-cloud applications. We welcome anyone who wants to participate and contribute to this vision.”

Zenko builds on the success of the company’s Scality S3 Server, the open-source implementation of the Amazon S3 API, which has experienced more than 600,000 DockerHub pulls since it was introduced in June 2016. Scality is releasing this new code to the open source community under an Apache 2.0 license, so that any developer can use and extend Zenko in their development.

“With Zenko, Scality makes it even easier for enterprises of all sizes to quickly and cost-effectively deploy thousands of apps within the Microsoft Azure Cloud and leverage its many advanced services,” said Jurgen Willis, Head of Product for Azure Object Storage at Microsoft Corp. “Data stored with Zenko is stored in Azure Blob Storage native format, so it can easily be processed in the Azure Cloud for maximum scalability.”

Zenko Multi-Cloud Data Controller expands the Scality S3 Server, and includes:

  • S3 API – Providing a single API set and 360° access to any cloud. Developers want to have an abstraction layer allowing them the freedom to use any cloud at any time. Scality Zenko provides a single unifying interface using the Amazon S3 API, supporting multi-cloud backend data storage both on-premises and in public cloud services. Zenko is available now for Microsoft Azure Blob Storage, Amazon S3, Scality RING and Docker and will be available soon for other cloud platforms.
  • Native format – Data written through Zenko is stored in the native format of the target storage and can be read directly, without the need to go through Zenko. Therefore, data written in Azure Blob Storage or in Amazon S3 can leverage the respective advanced services of these public clouds.
  • Backbeat data workflow – A policy-based data management engine used for seamless data replication, data migration services or extended cloud workflow services like cloud analytics and content distribution. This feature will be available in September.
  • Clueso metadata search – An Apache Spark-based metadata search for expanded insight to understand data. Clueso makes it easy to interpret petabyte-scale data and easily manipulate it on any cloud to separate high-value information from data noise. It provides the ability to subset data based on key attributes. This feature will be available in September.

Application developers looking for design efficiency and rapid implementation will appreciate the productivity benefits of using Zenko. Today, applications must be rewritten to support each cloud, which reduces productivity and makes the use of multiple clouds expensive. With Zenko, applications are built once and deployed across any cloud service.
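The build-once, deploy-anywhere idea can be sketched as an adapter pattern (hypothetical classes, not Zenko's actual API): the application codes against one interface while the backends differ:

```python
class ObjectStore:
    """Unified interface the application codes against."""
    def put(self, key, data):
        raise NotImplementedError
    def get(self, key):
        raise NotImplementedError

class InMemoryBackend(ObjectStore):
    """Stand-in for one cloud backend; a real adapter would call its API."""
    def __init__(self):
        self._blobs = {}
    def put(self, key, data):
        self._blobs[key] = data
    def get(self, key):
        return self._blobs[key]

def archive_report(store, name, body):
    # Application logic is written once, against the interface only.
    store.put("reports/" + name, body)

backend = InMemoryBackend()  # swap in another backend without app changes
archive_report(backend, "q3.csv", b"revenue,42")
print(backend.get("reports/q3.csv"))  # b'revenue,42'
```

Zenko's approach differs in one key respect: the common interface is the S3 API itself, so existing S3-compatible applications need no adapter code at all.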

“Cityzen Data provides a data management platform for collecting, storing, and delivering value from all kinds of sensor data to help customers accelerate progress from sensors to services, primarily for health, sport, wellness, and scientific applications,” said Mathias Herberts, co-founder and CTO at Cityzen Data. “Scality provides our backend storage for this and gives us a single interface for developers to code within any cloud on a common API set. With Scality, we can write an application once and deploy anywhere on any cloud.”

 

Sign up for the free insideBIGDATA newsletter.



via insideBIGDATA http://ift.tt/2ukwLOQ

AI Suggests Recipes Based on Food Photos

There are few things social media users love more than flooding their feeds with photos of food. Yet we seldom use these images for much more than a quick scroll on our cellphones. Researchers from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) believe that analyzing photos like these could help us learn recipes and better understand people’s eating habits. In a new paper the team trained an AI system to look at images of food and be able to predict the ingredients and suggest similar recipes. In experiments the system retrieved the correct recipe 65 percent of the time.

“In computer vision, food is mostly neglected because we don’t have the large-scale data sets needed to make predictions,” says Yusuf Aytar, a postdoctoral associate who co-wrote a paper about the system with MIT professor Antonio Torralba. “But seemingly useless photos on social media can actually provide valuable insight into health habits and dietary preferences.”

The paper will be presented later this month at the Computer Vision and Pattern Recognition conference in Honolulu. CSAIL graduate student Nick Hynes was lead author alongside Amaia Salvador of the Polytechnic University of Catalonia in Spain. Co-authors include CSAIL post-doc Javier Marin, as well as scientist Ferda Ofli and research director Ingmar Weber of QCRI.


How it works

The Web has spurred a huge growth of research in the area of classifying food data, but the majority of it has used much smaller datasets, which often leads to major gaps in labeling foods. In 2014 Swiss researchers created the “Food-101” data set and used it to develop an algorithm that could recognize images of food with 50 percent accuracy. Future iterations only improved accuracy to about 80 percent, suggesting that the size of the dataset may be a limiting factor. Even the larger data sets have often been somewhat limited in how well they generalize across populations. A database from the City University in Hong Kong has over 110,000 images and 65,000 recipes, each with ingredient lists and instructions, but only contains Chinese cuisine.

The CSAIL team’s project aims to build off of this work but dramatically expand in scope. Researchers combed websites like All Recipes and Food.com to develop “Recipe1M,” a database of over 1 million recipes which were annotated with information about the ingredients in a wide range of dishes. They then used that data to train a neural network to find patterns and make connections between the food images and the corresponding ingredients and recipes. Given a photo of a food item, the team’s system – which they dubbed Pic2Recipe – could identify ingredients like flour, eggs and butter, and then suggest several recipes that it determined to be similar to images from the database.
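The retrieval step can be caricatured with a much simpler stand-in than the paper's neural embeddings (ingredient sets compared by Jaccard similarity, with invented data):

```python
def jaccard(a, b):
    """Overlap between two ingredient sets, from 0 (disjoint) to 1 (equal)."""
    return len(a & b) / len(a | b)

# Toy recipe database: name -> ingredient set.
recipes = {
    "pancakes": {"flour", "eggs", "milk", "butter"},
    "omelette": {"eggs", "butter", "cheese"},
    "smoothie": {"banana", "milk", "yogurt"},
}

# Ingredients "predicted" from a photo; retrieve the most similar recipe.
predicted = {"flour", "eggs", "milk"}
best = max(recipes, key=lambda name: jaccard(predicted, recipes[name]))
print(best)  # pancakes
```

Pic2Recipe learns a joint embedding of images and recipes instead of matching ingredient sets directly, but the nearest-neighbor retrieval idea is the same.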

“You can imagine people using this to track their daily nutrition, or to photograph their meal at a restaurant and know what’s needed to cook it at home later,” says Christoph Trattner, an assistant professor at MODUL University Vienna in the New Media Technology Department who was not involved in the paper. “The team’s approach works at a similar level to human judgement, which is remarkable.”

The system did particularly well with desserts like cookies or muffins, since that was a main theme in the database. However, it had difficulty determining ingredients for more ambiguous foods, like sushi rolls and smoothies. It was also often stumped when there were similar recipes for the same dishes. For example, there are dozens of ways to make lasagna, so the team needed to make sure the system wouldn’t “penalize” recipes that are similar when trying to separate those that are different. (One way to solve this was by seeing if the ingredients in each are generally similar before comparing the recipes themselves.)

In the future, the team hopes to be able to improve the system so that it can understand food in even more detail. This could mean being able to infer how a food is prepared (i.e. stewed versus diced) or distinguish different variations of foods, like mushrooms or onions. The researchers are also interested in potentially developing the system into a “dinner aide” that could figure out what to cook given a dietary preference and a list of items in the fridge.

“This could potentially help people figure out what’s in their food when they don’t have explicit nutritional information,” says Hynes. “For example, if you know what ingredients went into a dish but not the amount, you can take a photo, enter the ingredients, and run the model to find a similar recipe with known quantities, and then use that information to approximate your own meal.”

The project was funded in part by QCRI, as well as the European Regional Development Fund (ERDF) and the Spanish Ministry of Economy, Industry and Competitiveness.

 




via insideBIGDATA http://ift.tt/2tRGDft

Why Businesses Can No Longer Ignore IoT Security

In this special guest feature, Srikant Menon, Practice Director of Internet of Things (IoT) at Happiest Minds Technologies, discusses how it is imperative for businesses to balance the massive benefits of IoT along with the security risks it poses. While millions of “things” are simple in nature, IoT security is an absolute must and should require an end-to-end approach.

via insideBIGDATA http://ift.tt/2w0ljpN

What Is Artificial Intelligence?

Here is a question I was asked to discuss at a conference last month: what is Artificial Intelligence (AI)?  Instead of trying to answer it, which could take days, I decided to focus on how AI has been defined over the years.  Nowadays, most people probably equate AI with deep learning.  This has not always been the case, as we shall see.

Most people say that AI was first defined as a research field in a 1956 workshop at Dartmouth College.  In reality, it had been defined six years earlier, by Alan Turing in 1950.  Let me cite Wikipedia here:

The Turing test, developed by Alan Turing in 1950, is a test of a machine's ability to exhibit intelligent behavior equivalent to, or indistinguishable from, that of a human. Turing proposed that a human evaluator would judge natural language conversations between a human and a machine designed to generate human-like responses. The evaluator would be aware that one of the two partners in conversation is a machine, and all participants would be separated from one another. The conversation would be limited to a text-only channel such as a computer keyboard and screen so the result would not depend on the machine's ability to render words as speech.[2] If the evaluator cannot reliably tell the machine from the human, the machine is said to have passed the test. The test does not check the ability to give correct answers to questions, only how closely answers resemble those a human would give.

The test was introduced by Turing in his paper, "Computing Machinery and Intelligence", while working at the University of Manchester (Turing, 1950; p. 460).[3] It opens with the words: "I propose to consider the question, 'Can machines think?'" Because "thinking" is difficult to define, Turing chooses to "replace the question by another, which is closely related to it and is expressed in relatively unambiguous words."[4] Turing's new question is: "Are there imaginable digital computers which would do well in the imitation game?"[5] This question, Turing believed, is one that can actually be answered. In the remainder of the paper, he argued against all the major objections to the proposition that "machines can think".[6]

 


So, the first definition of AI was about thinking machines.  Turing decided to test thinking via a chat. 

The definition of AI rapidly evolved to include the ability to perform complex reasoning and planning tasks.  Early successes in the 1950s led prominent researchers to make imprudent predictions about how AI would become a reality in the 1960s.  The failure of those predictions to materialize led to funding cuts known as the AI winter of the 1970s. 

In the early 80s, building on some success in medical diagnosis, AI came back with expert systems.  These systems tried to capture the expertise of humans in various domains and were implemented as rule-based systems.  Those were the days when AI focused on performing tasks at the level of the best human experts.  Successes like IBM's Deep Blue beating the chess world champion, Garry Kasparov, in 1997 were the acme of this line of AI research.

Let's contrast this with today's AI.  The focus is on perception: can we build systems that recognize what is in a picture, what is in a video, what is said in a soundtrack?  Rapid progress is underway on these tasks thanks to the use of deep learning.  But is it still AI?  Are we automating human thinking?  In reality, we are automating tasks that most humans can do without any thinking effort.  Yet we see lots of bragging about AI being a reality when all we have is some ability to mimic human perception.  I find it ironic that our working definition of intelligence has become one of mere perception rather than thinking.

 

Granted, not all AI work today is about perception.  Work on natural language processing (e.g. translation) is a bit closer to reasoning than the perception tasks described above.  Successes like IBM Watson at Jeopardy! or Google AlphaGo at Go are two examples of traditional AI aiming to replicate tasks performed by human experts.  The good news (to me at least) is that progress on perception is so rapid that it will move from a research field to an engineering field in the coming years.  We will then see researchers re-position toward other AI topics such as reasoning and planning, and we'll be closer to Turing's initial view of AI.



via Planet big data http://ift.tt/2uwyBdM

Business Intelligence as a Competitive Advantage in the Retail Industry

Compared to other sectors, like finance and technology, retail can be considered a late adopter of the advantages business intelligence offers to daily processes. This is a paradox, as retail operations are among the best suited to the insight provided by digital dashboards.

Questions like “Who is your ideal client?”, “Which products should you promote?”, “Which items should you sell as a bundle?”, “What is the preferred payment method?” and “How do your clients engage with your brand?” can all be answered through a BI platform that integrates point-of-sale data with demographics and interactions from online interfaces.

Why is Business Intelligence a solution for retail?

The value of BI comes from the evolution of retail companies from organizations based on operations to companies built on innovation, with consolidation, integration, and optimization as the intermediate stages. This is a journey from ad hoc to automated, from naïve to well-defined processes that also gain a predictive dimension. Most retailers are said to be in the integration stage, where the company has enough data to make decisions based on market signals, yet the vast majority are not leveraging what they have. Consultancies such as Itransition typically offer business intelligence in three stages: monitoring intelligence, analytical intelligence and, most importantly, predictive intelligence.

Business Intelligence is a general name for several applications that can help a company have an integrated overview of vendors, stocks, clients, payments, and marketing. This is a new approach, in contrast to the siloed way of working, specific to the pre-dot com era. A store is like a living organism. Treating each component separately prevents the organization from seeing the bigger picture and the cross-influences that could be used as profit centers.

What are the main BI applications?

Data can show a company how to size their stocks, price products based on demand and create promotions and sales targets to maximize revenue.

Costs, Prices, and Stock Management

Net profit margins in retail are small, so pricing products to avoid losses while remaining competitive is one of the greatest challenges. All costs of doing business should be taken into account, including unexpected situations, with decisions grounded in scenario planning and seasonal influences.

Stock management is one of the costliest aspects of retail, and BI solutions strive to create the ideal optimization model based on past purchases and future trends. A great app offers stock analysis, highlights the best-selling products and creates replenishment orders for them, while simultaneously advising managers to cancel orders for the worst-performing products.
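As a rough illustration of the replenish-or-cancel logic such an app might apply (the function, product names and cut-offs are invented for this sketch, not taken from any particular BI product):

```python
def stock_actions(weekly_sales, top_n=2, bottom_n=1):
    """Rank products by recent sales velocity, then propose
    replenishment for the best sellers and flag the worst
    performers for order cancellation.

    weekly_sales: {product: units sold last week}
    """
    ranked = sorted(weekly_sales, key=weekly_sales.get, reverse=True)
    return {
        "replenish": ranked[:top_n],          # best sellers
        "cancel_orders": ranked[-bottom_n:],  # worst performers
    }

sales = {"cola": 120, "chips": 95, "umbrellas": 3, "bread": 60}
actions = stock_actions(sales)
print(actions["replenish"])      # ['cola', 'chips']
print(actions["cancel_orders"])  # ['umbrellas']
```

A real system would of course fold in future trends (seasonality, promotions) rather than last week's units alone, but the rank-then-act shape stays the same.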

Customer Analytics

Numbers help you get into the mindset of your clients, see their path from learning about your existence to becoming advocates of your quality. You need to understand the correlations between their demographic and sociographic characteristics and the content of their shopping carts. Pinpoint the link between the ads they see and the products they buy. Drill down to find out their whereabouts, payment methods, time spent in the brick and mortar store or the online retail environment. Put all this data together to create product bundles and promotions.

Vendor Management and Evaluation

Without a vendor evaluation, there is no business growth. You need to see per-product results and per-vendor analysis, and make decisions accordingly. A BI solution should consider things like delivery time, clients’ satisfaction with the return policy and even brand perception, if applicable.

Sales, Targets and Performance Analysis

In a small neighborhood store, you might want to know what the best-selling brands are and which sales assistants generate the most revenue, while in a multi-national corporation you may need to know which branches are meeting their quotas and which are falling short. These problems are essentially the same; only the scale is different. That makes BI a great tool here, due to its scalability.

Additional drill-down levels can be added to an existing solution to get to the root of problems. The BI system can be the base of strategic decisions such as the product mix offered or the bonuses and promotions given to sales agents. The numbers from the dashboard also give a great estimate for setting attainable but motivating sales targets, based on the forecasts.

Trends, Forecasting, and Planning

Even retailers in the 1960s were looking at historical performance. The difference in the smart systems is that you no longer have to wait until the end of the month to do the math and see how the business is doing. A BI system performs in real-time and can dynamically adjust actions to maximize your profit. It’s like constant course correction when sailing to your destination, instead of waiting to see where the waves will take you.

Setting the Right KPIs in Digital Dashboards

Each organization can set its own KPIs, depending on its activity type, strategy and business proposal, but there are a few general guidelines that can be used successfully, as defined by the Supply Chain Operations Reference (SCOR) Level 1 metrics. These include:

  • reliability, measured by order fulfillment and delivery performance;
  • responsiveness, usually expressed as time;
  • flexibility, a combination of vendor and production agility;
  • costs;
  • asset management efficiency.
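As a toy illustration of how the first two metrics above might be computed from order records (the field names and figures are hypothetical, not a SCOR schema):

```python
# Each record notes whether the order arrived on time and complete,
# and how many days the fulfillment cycle took.
orders = [
    {"on_time": True,  "complete": True,  "cycle_days": 2},
    {"on_time": True,  "complete": False, "cycle_days": 3},
    {"on_time": False, "complete": True,  "cycle_days": 6},
    {"on_time": True,  "complete": True,  "cycle_days": 1},
]

# Reliability: share of orders delivered both on time and complete.
perfect = sum(o["on_time"] and o["complete"] for o in orders)
reliability = perfect / len(orders)

# Responsiveness: average order-fulfillment cycle time, in days.
responsiveness = sum(o["cycle_days"] for o in orders) / len(orders)

print(f"reliability: {reliability:.0%}")        # reliability: 50%
print(f"responsiveness: {responsiveness} days")  # responsiveness: 3.0 days
```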

When it comes to client-related metrics, you can take inspiration from the sales funnel approach to select the best ones. These track passage from one stage to the next: measuring entry-point leads, computing conversion rates, the cost per conversion, the value of the conversion, the average sale price and the time spent in the funnel. The cost of recapturing missed opportunities through re-marketing should also be tracked.
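A minimal sketch of the funnel arithmetic described above, with invented stage names and figures:

```python
# Hypothetical funnel snapshot for one campaign period.
funnel = {"leads": 1000, "qualified": 400, "customers": 80}
ad_spend = 2000.0   # total campaign cost
revenue = 9600.0    # total value of the 80 conversions

conversion_rate = funnel["customers"] / funnel["leads"]    # lead -> customer
cost_per_conversion = ad_spend / funnel["customers"]       # spend per new customer
average_sale = revenue / funnel["customers"]               # average sale value

print(f"{conversion_rate:.1%}")   # 8.0%
print(cost_per_conversion)        # 25.0
print(average_sale)               # 120.0
```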

Depending on your business model, you are responsible for setting up KPIs to measure the performance of the sales agents.

Where is the retail industry heading?

The retail industry is a mature market with low net profit rates (1.5%–5%), where every process optimization and cost cut can mean the difference between survival and going out of business. Business Intelligence offers marketers, financial advisors, and strategists a starting point in their quest for a better understanding of clients’ needs and a better anticipation of the steps necessary to remain relevant.

The post Business Intelligence as a Competitive Advantage in the Retail Industry appeared first on Big Data Analytics News.



via Big Data Analytics News http://ift.tt/2vHLbHx

How Big Data Analytics Can Help Your Business

We all like to feel as if we have an intuitive sense of what our businesses need to succeed, but the reality is that successful companies rely on big data analytics to continuously understand, measure and improve. With powerful computing available via the cloud, and more tools and services for data collection and analysis than ever before, you can gain the edge over your competition, streamline your operations, connect with the right customers and even develop or refine the right products, using the unprecedented insights from big data. Refine your business strategy by integrating big data analytics into these five business areas:

Process improvement – work smarter

This is often one of the first areas businesses target and think of when it comes to big data analytics. Collecting data on your business process or production invariably exposes inefficiencies and opportunities. While there are often financial gains, resource use or reuse, scheduling and fulfillment/delivery are all areas that benefit from big data analysis.

Product improvement – create smarter

Beyond simply creating efficiencies in your process or production line, big data analytics used correctly will guide and refine your product or service offering. Use big data to gain insight into the market, trends and customer desires, refine your current offerings based on what best serves the company’s mission, direction and wellbeing, and test ideas early and often. This can be at the level of entirely new product lines, or it can be applied simply but effectively: invest in the benefits and features of your product or service that bring value to your customer, while rolling back those that bring the least value or don’t recoup their investment. As an added benefit, using big data early and often to develop and refine your product or service offering can help you identify market opportunities and promote the benefits of most interest to your customers more coherently.

Customer acquisition – market smarter

Your marketing and advertising dollars are money down the drain if you’re not connecting with the right audience. Refine your customer acquisition process by testing every element, every step of the way. Test and drop campaigns, platforms, ads, text and photos that don’t perform well. With powerful analytics tools, you can track levels of engagement with calls to action.
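A hypothetical sketch of the test-and-drop logic for campaigns, using invented engagement numbers and an illustrative conversion-rate threshold:

```python
# Click and conversion counts per campaign; names are invented.
campaigns = {
    "spring_banner": {"clicks": 500, "conversions": 40},
    "video_teaser":  {"clicks": 800, "conversions": 12},
}

def underperformers(campaigns, min_rate=0.05):
    """Flag campaigns whose call-to-action conversion rate falls
    below the chosen threshold, as candidates to drop."""
    return [name for name, c in campaigns.items()
            if c["conversions"] / c["clicks"] < min_rate]

print(underperformers(campaigns))  # ['video_teaser'] (1.5% < 5%)
```

In practice you would also want enough clicks per campaign for the rates to be statistically meaningful before dropping anything.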

For online properties such as your website, social channels and content marketing resources, drill down to granular elements like a catchphrase, call-to-action button or image. Stock photo websites, such as Dreamstime, can supply high-quality images to test for the best audience response. Refine your images and messaging with big data analytics to reach the customers who are seeking your services, without wasting energy and resources on other audiences.

Customer retention – grow smarter

It costs five times more to gain a new customer than to retain an existing one. Slash your customer acquisition costs by using big data analytics to track what keeps your customers and what turns them away. Plan regular touch-points to seek out feedback from your customers and ask them what works, what doesn’t and what they really want. Incorporate personalization to create a feeling of connection, greater satisfaction and desire for your products or services – but be sure to include a human touch and don’t rely solely on data. Use analytics to drive direction, but people to build connection. Big data also helps when it comes to functionally achieving the improvements in service or product that your customers want.

Employee retention – collaborate smarter

The use of big data analytics when it comes to improving employee satisfaction can be of great help – but be cautious with overemphasizing data over relationships when it comes to employees or customers. While big data analytics have been touted as an excellent way to identify the right skills for the job, they run the risk of reducing employees to a list of traits and achievements, and alienating potential talent that needs a human point of contact. As in all areas of your business, use big data strategically to search for candidates, test skills and collect aggregate feedback. Support equality and fair workplace policy, but don’t use it as the sole tool in your kit. Incorporate human touch-points in the process to catch opportunities that a machine might miss.

Whatever the size or nature of your business, big data analytics will help you understand opportunities for improvement and to remain competitive. Employ big data strategically to improve your process and service or product, market it better, find and keep the best customers and get the right people on your team.

The post How Big Data Analytics Can Help Your Business appeared first on Big Data Analytics News.



via Big Data Analytics News http://ift.tt/2vLi0Dg