
Who Owns the Data?

January 30th, 2018 Posted by Strategy

It’s a tale as old as mankind. When a precious new resource is discovered, groups of people will try to stake their claim or debate who the rightful owner should be. Data ownership is no different: every day, companies handling large amounts of data grapple with this. Often, the day-to-day handling of data is carried out by two, if not three, teams: IT, analytics, and sometimes marketing.

IT is the necessary safekeeper. Without a team working consistently and creatively to secure the data, the company is vulnerable to phishing attacks, malware, ransomware, and the like. In addition, IT usually handles data storage, though that could change as companies become more interested in finding new uses for their data.

Customer data analytics is the star of the cast, developing applications for the data. Over time, these applications have become more and more sophisticated, employing new self-service and user-friendly tools such as AI assisted customer analytics to activate the data in real-time as customers shop online. These teams are relatively new but have become essential to a company’s long-term competitiveness.

Marketing is one of the chief beneficiaries of a company’s data infrastructure. Generally, these teams use the insights gained from analyzing the company’s data to create more sophisticated marketing campaigns. For example, discovering what attracted, retained, or repelled customers can help the company serve them better and achieve enormous gains. This is why marketing departments across many different industries are being transformed by the emergence of big data.

For the data to be used optimally, it’s essential that these teams collaborate and define the analytics that will benefit each business group. Just as in a good play, each business group has a different (but equally important) role in contributing to the business’ success.

How does your company manage ownership over the data? Tweet at me or email me to share your thoughts and experiences.

 


It’s the Thought That Counts

December 8th, 2017 Posted by Analytics, Behavioral Data

Think about the feeling you get when someone you care about gives you a gift. Surprise… delight… overall, a pleasant feeling, right? That’s because a gift is thoughtful and gives the impression that some time and effort went into providing that special gifting experience.

That’s exactly what customers crave from their favourite brands. Customers don’t want to feel misunderstood or overlooked. They want personalised content, offers, and recommendations that surprise and delight them.

Personalisation is not some abstract concept

Large, multinational, growing companies are investing more and more into personalisation each year. And the stakes couldn’t be higher during the holiday season. In 2017, Cyber Monday alone brought in $6.59 billion in online sales, surpassing all prior years. Black Friday didn’t disappoint either – there were $5.03 billion in online sales.

But these hordes of customers don’t magically appear out of a hat. In this competitive environment, retailers need to earn each customer. And a growing number of online buyers expect tailored offers and purchase recommendations from the sites they visit.

So how do you get ahead of your competition to attract customers? Two words: machine learning (also commonly referred to as ML).

Machine learning has become the standard for all winners in the online retail ecosystem: Amazon, eBay, Wal-Mart, you name it, are all utilising ML. And if you’re not (that is, if you’re looking at customer data as a static thing), you will quickly lose out.

Today it’s absolutely possible to map and predict customer behaviours based on real-time interactions. Every hover, click, scroll, and keystroke builds a sequence of events unique to an individual user. ML, together with data science, helps you leverage that data more effectively, automatically deriving insight from it and helping your organisation understand the customer better. That insight and model output can then power the various customer experience platforms in your organisation. And the earlier you implement ML, the sooner you’ll deliver relevant content, convert customers before your competition does, and gain peace of mind about your marketing campaigns.
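To make that concrete, here’s a minimal sketch of the first step: turning a raw click log into the per-user event sequences a behavioural model would learn from. The event names and log format below are invented for illustration, not any vendor’s schema.

```python
from collections import defaultdict

# Hypothetical raw click log: (user_id, timestamp, event) tuples.
events = [
    ("u1", 1, "view:shoes"),
    ("u2", 2, "view:hats"),
    ("u1", 3, "add_to_basket:shoes"),
    ("u1", 4, "checkout"),
    ("u2", 5, "view:shoes"),
]

def build_sequences(events):
    """Group events into a time-ordered sequence per user -- the raw
    material for any behavioural model."""
    sequences = defaultdict(list)
    for user_id, ts, event in sorted(events, key=lambda e: e[1]):
        sequences[user_id].append(event)
    return dict(sequences)

sequences = build_sequences(events)
# sequences["u1"] -> ["view:shoes", "add_to_basket:shoes", "checkout"]
```

Once every hover, click, and scroll is captured this way, each user’s sequence becomes a feature-rich fingerprint that a model can score in real time.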

Are you getting it right with your personalisation efforts? Feel free to share your experiences with me. What’s working and what’s not? You can email me or connect with me on LinkedIn.

 


Unlocking the Value In Your Clickstream Data

November 15th, 2017 Posted by Behavioral Data

In 2017, 65% of all US retail sales involved a visit to the company’s website. And as you may have already realised, your website is one of the most important sources of data for understanding your customers’ wants, needs, and preferences. Assuming you have web analytics in place, this source gives you the opportunity to track and collect data on every single interaction a customer makes with your brand.

In-store, you can track what a customer has bought or returned. On social media, you can get a view of which adverts they have seen, or which content they have browsed. But it’s only with web analytics that you can truly track every interaction, regardless of whether a purchase was completed.

Sure, you cannot track the mood a person was in while browsing, nor what they had for breakfast. But no other channel gives you the chance to learn so much about your customers: which pages and products someone has viewed, time spent on page, engagement with bespoke content widgets, basket interactions, video plays, and customer feedback views. It’s clickstream data capture that allows you to track every interaction with your brand.

This is the point most organisations have reached today: leveraging their clickstream data in their chosen personalisation, DMP, and email tools. There is, however, much more that can be done with this data. You might say that you could look at a customer’s historical in-store transactions and work out what you think they might be interested in next. However, finding a segment of 100,000 people who, in the last 14 days, have viewed a certain product type without purchasing is undeniably more powerful. I know which audience I would rather market to.
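As a rough sketch of how simple that segment is to express once event-level clickstream data is accessible (the record layout and field names here are invented for illustration, not any vendor’s schema):

```python
# Hypothetical event records: (customer_id, event, product_type, days_ago).
events = [
    ("a", "product_view", "boots", 3),
    ("a", "purchase",     "boots", 2),
    ("b", "product_view", "boots", 5),
    ("c", "product_view", "boots", 20),  # outside the 14-day window
    ("c", "product_view", "coats", 1),
]

def audience_viewed_not_purchased(events, product_type, window_days=14):
    """Customers who viewed a product type within the window but did
    not purchase it in that same window."""
    recent = [e for e in events if e[3] <= window_days]
    viewed = {c for c, ev, p, _ in recent
              if ev == "product_view" and p == product_type}
    bought = {c for c, ev, p, _ in recent
              if ev == "purchase" and p == product_type}
    return viewed - bought

segment = audience_viewed_not_purchased(events, "boots")
# Only customer "b" viewed boots recently without buying.
```

At scale this would be a query against your Big Data Platform rather than an in-memory list, but the set logic is exactly the same.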

So, if 65% of all US retail sales involve a visit to the company’s website, and if overall web share of sales typically ranges between 35% and 60%, there is no doubt about the impact your website has on overall sales.


How can we unlock the value of our clickstream data?

Web analytics is great. You can track demand, product interest, and online sales, and even identify problem areas on your digital properties. These are just some of the critical functions a web analytics tool performs. However, such tools have their limitations.

The type of querying you can do is pretty limited, as are the types of visualisations you can create. The data science functionality, when available, is restricted to proprietary, pre-defined models. The schemas that these datasets are stored in are predefined and the type of data you can pull into these applications is generally limited to lookup data.

To really unlock the value in your clickstream data, you need only pull the data out of these applications and into a Big Data platform. When you do, you can knock down the barriers of restrictive querying, focus on individual customers (not cookies), and have clickstream data feeding in alongside your many other data sources. Only then can you truly achieve an omni-channel view of your customer.

The 5 steps to unlocking the value in your clickstream data are:

1. Customers buy from brands, cookies do not buy from channels. Do Big Data.

Most organisations have realised that customers expect an omni-channel experience and have started (or are about to embark on) a Big Data platform project to handle this. It is of paramount importance that the clickstream data be a part of this project. Clickstream data can be tricky to work with, as Web Analytics vendors do not always provide it in a format that is easy to use. However, with so much data on existing customers, as well as prospects, this data source is just too valuable to ignore. When the clickstream data is accurate and made available in your Big Data Platform, in a schema that makes sense, you can achieve a true, omni-channel view of how your customers are interacting with your brand and begin to understand and model how best to serve them.


2. Data Science. Stop ignoring 90% of your data.

When querying and segmenting, web analysts tend to focus on high-value data such as purchases, cart adds, visits to key pages in the conversion funnel, etc. You might then segment this data by some key dimensions: product types, geographical properties, first-time visitors, and so on. Realistically, you might be able to take these key metrics and apply 5 or 6 layers of segmentation to get to something interesting, be it an audience or a metric. Even in this scenario, you have probably ignored 90% of the data you have for each visitor included in the audience. Think of all the data your Web Analytics tool collects with each tag fire, then multiply this by the number of tag fires each visitor generates. You have so much information on your web visitors; why ignore it? Using data science techniques, you can begin to consider the other 90% of your clickstream data and find behavioural patterns. For example, just because I have viewed a certain product or service more than 3 times, or added something to my basket before abandoning, does not mean I am ripe for retargeting. Conversely, just because I have not completed these interactions does not mean I am not interested in buying the product. The truth may lie in everything else I have done on your website. Start considering the full picture, and use open source libraries when doing so.
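A minimal sketch of that idea, using a toy event stream: instead of hand-picking five or six key metrics, count every event type so a downstream model sees the full picture. The event names are invented for illustration.

```python
from collections import Counter

# Hypothetical visitor event streams -- every tag fire, not just the
# handful of "key" conversion events.
visits = {
    "v1": ["page_view", "page_view", "video_play", "scroll", "scroll"],
    "v2": ["page_view", "cart_add", "page_view"],
}

def full_feature_vector(events, vocabulary):
    """Count every event type, so the model considers the whole
    stream rather than only high-value metrics."""
    counts = Counter(events)
    return [counts.get(name, 0) for name in vocabulary]

# One column per event type observed anywhere in the data.
vocabulary = sorted({e for stream in visits.values() for e in stream})
features = {v: full_feature_vector(s, vocabulary) for v, s in visits.items()}
```

Vectors like these can be fed straight into open source libraries (scikit-learn clustering or classification, for instance) to surface behavioural patterns that no amount of manual segmentation would find.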

3. Reporting, it is a business-wide problem.

Every business needs to produce reporting that tracks sales, revenue, and demand. Clickstream data is a vital component of this, particularly for tracking digital sales but perhaps most importantly as a measure of demand for your various products and services. If you do not have your clickstream data in your Big Data Platform you are left with the unenviable task of manually trying to combine separate reports into a single sales dashboard. Worse still, you may simply ignore the clickstream data and not report on your demand levels. But with your data consolidated in one place, not only can you easily produce high-value omni-channel business reports but you can also be more agile in answering any follow-up questions, with the required data in one place and ready to use.
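To illustrate the consolidation step (with made-up numbers and field names), combining clickstream demand with sales figures can be as simple as a join over product once both live in the same platform:

```python
# Hypothetical per-product figures from two silos; numbers are invented.
demand = {"boots": 1200, "coats": 800}   # product views (clickstream)
sales = {"boots": 90, "coats": 16}       # completed orders

def demand_report(demand, sales):
    """One consolidated row per product: demand, sales, and a simple
    view-to-order conversion rate."""
    report = {}
    for product in sorted(set(demand) | set(sales)):
        views = demand.get(product, 0)
        orders = sales.get(product, 0)
        rate = orders / views if views else 0.0
        report[product] = {"views": views, "orders": orders,
                           "conversion": round(rate, 3)}
    return report

report = demand_report(demand, sales)
```

With the data in one place, a follow-up question ("why is coats converting so poorly?") is a new query, not a new week of manually stitching reports together.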

4. Monetise your clickstream data, it is already costing you a small fortune.

Web Analytics tools are not cheap, and the more data you collect the more you have to pay. My previous points have examined optimal ways of storing, processing and querying your data, allowing you to learn more from the data you already have. Once you have accomplished this and created various scores and segments for your existing customers and prospects, you need to get this information into the applications you use to communicate with your customers. Connections between your Big Data Platform and your marketing tools (DMP, personalisation, customer contacts, and anything else you use), be it for online or offline customer communication, need to be established. At that point, you can automate the data flows and ensure customers are receiving a relevant, personalised and consistent experience with your brand, across the different channels that are available to them.

5. Automate everything you can.

Within an Analytics function, there are a limited number of people. Typically, these people are highly skilled in one or more areas of Analytics, be it querying, data science, visualisations, or data engineering. By automating as many processes as possible, you can ensure that the resources available to you are delivering new insights and are seen as a value driver rather than a potential constraint or blocker. When it comes to clickstream data, the aim is to automate the ingestion, processing, and auditing of this data in your Big Data Platform. This frees up the data engineer to work on new data sources and ensures your analysts and data scientists have up-to-date, ready-to-use data at all times. The same then applies to getting data out of your Big Data Platform and into the applications you use to communicate with your customers. Establish these data pipelines and automate these flows. Without this, you risk becoming a blocker to the business, as the time to delivery of new outbound data sends will be very long.
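A toy sketch of that automation, with invented stage names: chain ingestion, processing, and auditing into a single callable pipeline, then schedule that call so the data engineer never runs it by hand.

```python
# Illustrative pipeline stages -- the names and checks are assumptions
# for this sketch, not any real platform's API.

def ingest(raw_rows):
    """Pull raw clickstream rows in (here, simply pass them through)."""
    return list(raw_rows)

def process(rows):
    """Normalise the raw feed into a usable schema."""
    return [{"user": u.strip().lower(), "event": e} for u, e in rows]

def audit(rows):
    """Fail fast if the feed looks broken, so bad data never lands."""
    assert rows, "empty feed"
    assert all(r["user"] for r in rows), "missing user ids"
    return rows

def run_pipeline(raw_rows):
    # Chaining the stages is what gets automated on a schedule.
    return audit(process(ingest(raw_rows)))

result = run_pipeline([(" U1 ", "page_view"), ("u2", "cart_add")])
```

The same pattern applies in reverse for the outbound flows: an automated export stage feeding scores and segments to your marketing tools, audited before every send.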

So, if you’re looking to boost sales and revenue, clickstream data will be your go-to source for finding the customers to reach out to. It should also be an essential component of your business’s data strategy. If you get this data into your Big Data Platform in a structured and usable format, and establish the required outbound data pipes, you will give your analysts the tools they need to increase data value, and your marketing team will be able to maintain true, omni-channel engagement with prospects and customers.

 

References:
     – Digital Commerce 360: 60% of U.S. Retail Sales Will Involve the Web By 2017
     – Deloitte: Understanding Consumer Shopping Behavior
     – RetailNext: Brick & Mortar Vs. Online

 

 


Bal Basra

Bal Basra is a Solutions Consultant at Syntasa, assisting customers and prospects in extracting the maximum value from their big data projects. He brings tremendous knowledge of analytics products and methodologies from previous positions at Adobe and TUI. Connect with him on LinkedIn.

 


6 Tips for A New Chief Data Officer

September 29th, 2017 Posted by Executive, Strategy

A guy walks into a bar. The bartender asks, “What’ll it be today?” The guy answers: “Gee, I just don’t know.” Pretty boring joke, right? The same letdown occurs when a new Chief Data Officer has no clear strategy for the future.

The title of Chief Data Officer has only emerged in the past few years, and most CDOs today are the first to fill that role at their company. And because they have no shoes to fill, they tend to walk in barefoot. That is, they come in without any overarching strategy for how the company’s data should be stored, shared, and exploited. BIG MISTAKE, because the earlier a company develops its data strategy, the more valuable its data will become in the long term.

Here are 6 tips for an incoming CDO:

1. Think outside the box. Every day, companies are finding new ways to slice and dice data in a valuable way. For instance, internal data like inventory is now helping companies become more efficient. The CDO should survey all the departments to find out what kind of data is out there, just waiting to be activated.

2. Develop a working plan to define the ways in which data can unlock business outcomes. Don’t set a 1-year plan, let alone a 5- or 10-year plan, as it will only be changed multiple times within the first year.

3. Determine who is in charge of what data. That could be several teams – for instance, IT could be in charge of securing it, and Data Analytics in charge of building apps over the data lake.

4. Ensure the company has a logical map for the data. The data lake is an incredibly rich resource, which is why it needs to be used smartly. A clear data architecture will save time and money and facilitate the creation of apps.

5. Define a strategy for apps. This should be a central concern for the CDO. After all, the data on its own is like silver in the mine. It is useless until it gets activated. From the outset, the CDO should have a strategy on the types of apps the company should develop versus buy.

6. Start small, scale fast. Find a dataset that is easy enough to pull insights from, such as clickstream (or event) data. The company can leverage this kind of data very quickly, especially with predictive behavioral analytics.

 

If any of this resonates with you, tweet at me or email me to share your thoughts and experience with, or as, a CDO.

 


Will GDPR Rule You or Will You Rule It?

September 20th, 2017 Posted by Français 🇫🇷, Government

As you’ve probably heard, starting next year any company that stores data on citizens of the European Union will have to comply with a stringent set of new data privacy rules called the General Data Protection Regulation (GDPR). The deadline to comply is fast approaching: GDPR will apply across all 28 member states of the European Union, including the UK, beginning May 25, 2018.

Are you ready for it?

Background on GDPR

Replacing rules last updated in 1995, the new regulation is expansive and detailed. And the penalty for non-compliance is steep: up to 4% of a company’s global annual turnover. Large companies with a European presence have been watching the development of the rule and are, by and large, ready for the May 2018 deadline.

Companies are also realizing that the new rules don’t just apply to their organization, but also to the dozens of third-party vendors that have access to their data.

The question is particularly tricky when it comes to cloud providers. For instance, under GDPR a European individual can request that a company destroy all personal data it may hold about them. This includes all data stored with its cloud providers, and the deletion must be completed within 30 days of the request. That means a company will have to devote significant effort to ensuring that all its cloud providers can also comply within that timeframe.

In fact, ensuring that all data partners can comply with the rule is shaping up to be one of the biggest challenges of GDPR.

Steps to Take

The Cloud Industry Forum offers a number of suggestions for tackling this difficult task:

  • Identify the location where the cloud app is storing the personal data
  • Take measures to protect it
  • Sign data processing agreements with all cloud partners
  • Limit the data sent to the cloud as much as possible
  • Restrict how cloud apps can use the personal data
  • Ensure that the data can be erased once your partnership with the cloud application ends

According to a Netskope Cloud Report, a single European enterprise uses, on average, more than 600 business cloud providers. Imagine the headache of tracking all the data partners involved in processing your data. Therefore, it’s likely that many companies will terminate partnerships with cloud providers as a result of GDPR.

Benefits of Syntasa

Thankfully, there are other options for outsourcing data processing functions. At Syntasa, for instance, we will never take ownership of your data or hold it in our servers. Our software application sits within the enterprise architecture. No customer data ever leaves the enterprise database.

This means several things. First, if you are compliant, we are compliant. You will never have to ask us to erase anyone’s date of birth, credit card information or heart rate, simply because we won’t have access to it.

In addition, using a sophisticated algorithm, we can minimize the need for dozens of other cloud applications. Syntasa can also help process and locate sensitive data, so that your company can comply with GDPR more efficiently.

We use behavioral data analytics to enable companies to tailor their marketing campaigns and other customer interactions in real-time. But we do this without stalking the user or asking for their sensitive data. We simply rely on the user’s online behavior to determine their preferences, comparing it with the behavior of thousands of anonymous users who were in their shoes in the past.

Wouldn’t you want to remove the extra work of ensuring your cloud providers are complying with GDPR? Why should you have to worry about activities outside the confines of your own organization?

So, who’s ready to rule GDPR? Tweet at me or email me to share your thoughts about GDPR.

If you require more information on how Syntasa can help with GDPR, please contact us at info@syntasa.com.

 


Eric Siegel on the State of Predictive Analytics

September 14th, 2017 Posted by Interviews, Predictive Analytics

When you think of predictive analytics, which person comes to mind?

Think about it for a moment and then hold that thought…

For me, it most definitely is Eric Siegel. If you don’t already know, Eric Siegel, Ph.D., is the founder of the Predictive Analytics World conference series, the Executive Editor of Predictive Analytics Times, and author of Predictive Analytics: The Power to Predict Who Will Click, Buy, Lie, or Die (Revised and Updated).

So when I got the opportunity to pick someone’s brain for a new interview series we’re doing, who else but Eric? Here’s what he had to share about the state of predictive analytics today.

What value does predictive analytics deliver to companies in their digital marketing efforts?

ERIC: Predictive analytics (aka machine learning) targets content, product recommendations, fraud detection, and retention efforts — in all cases, rendering these processes more effective. To get an idea of your possible upside, start with others’ case studies and then do a scratch calculation to forecast your own win. For the first of these two steps, the central insert of my book “Predictive Analytics” is a compendium of 182 mini-case studies divided into nine industry groups, including examples from BBC, Citibank, ConEd, Facebook, Ford, Google, the IRS, Match.com, MTV, PayPal, Pfizer, Spotify, Uber, UPS, Wikipedia, and more.

Where are we in the adoption curve of predictive analytics, as it relates to digital marketing?

ERIC: We don’t have the complete industry data to properly answer that question, but I would informally estimate that we’re at about 5-10% of where it could, and eventually will, be in terms of adoption, implementation, and effective deployment. Despite my low estimate, I would say the deployment of machine learning is well beyond the early “Innovator” or “Early Adopter” stages. The concepts and technology/solutions are fully developed and proven. But the process to commercially integrate and deploy is not just a technical one – it is an organizational process. This is quite different from most technologies. You need to not only crunch data and derive predictive scores per individual, you then need to actually change the preexisting operational process to make use of the predictive scores, thus fundamentally changing “business as usual”.

What do you see as the biggest challenges in adopting predictive analytics?

ERIC: The greatest pitfall is an organizational/process one. The deployment of predictive analytics is not turnkey or plug-and-play. You don’t just “install” it. Rather, it is a change to organizational processes, priorities, and basic system operations. The per-individual predictions generated by this technology – such as whether an individual will click, buy, defect, commit fraud, or unsubscribe from an email list – are only valuable when acted upon (i.e., integrated into existing systems, thus actively changing “business as usual”). To that end, the project must be conceived up to and including the executive level, and there must be broad organizational buy-in, commitment, and coordination.

What are you most excited about when it comes to the future of predictive analytics?

ERIC: While core technology and software solutions are evolving in exciting ways, I’m most excited about the breadth of business applications, both across digital marketing and beyond (sectors such as financial credit risk and healthcare deploy the same core analytical technology in analogous ways). As the awareness, understanding, and comfort with deploying predictive models grow, so does its organic integration into more and more processes.


Do you know someone else working in predictive analytics? They could be featured in a future post. Tweet at me, connect with me, or email me to let me know.

Predictive Analytics World is coming to New York, London, and Berlin this fall. Don’t miss out!

You can also find Eric Siegel on Twitter and LinkedIn.

 


Retailers Can Have Their Bricks and Click Them Too

August 30th, 2017 Posted by Analytics, Behavioral Data

Despite what you might have read, retail is not dying. Sure, brick-and-mortar retailers today face significant competition from their digital counterparts. But with the right omni-channel strategy, they can outperform online-only stores by providing the best of both worlds: the convenience of e-commerce along with the human experience of physical stores.

Last week, Macy’s announced its new President will be Hal Lawton, a former eBay and Home Depot executive who is credited with building Home Depot’s stellar interconnected retail experience. Macy’s knows that the path to sustainability involves a unified online and in-store strategy and it has plans to expand its data analytics and consumer insights.

That’s because today retail stores are sitting on an enormous mound of customer and enterprise data, which includes point-of-sale receipts, online visits and purchases, warehouse inventory, and so on. And all of these data points are extremely valuable with the right data analytics strategy and technology in place.

In particular, predictive behavioral analytics allows retailers to know what to do, when, and where. As a result, a store can maintain optimal inventory levels and anticipate what a customer will want to look at on their next visit. It can also pair a customer’s in-store and online activities to ensure a seamless customer experience and an optimal conversion rate with each visit. Imagine a sales representative having the most up-to-date customer information at their fingertips to help the customer determine the next best action.

It is these kinds of capabilities that will allow companies to stay relevant and win big.

If any of this resonates with you, tweet at me or email me to share your thoughts and experience with analytics.

 


Data That Stays Together, Works Together

July 31st, 2017 Posted by Strategy

When it comes to data analytics, marketers are missing the forest for the trees. If you think your company’s most marketable data source lies in your enterprise data, think again. Your company is sitting on a gold mine of customer data, siloed in different departments, just waiting to be integrated and activated.

According to eConsultancy’s 2017 Digital Intelligence Briefing on Digital Trends, 59% of marketers who have an intermediate or advanced understanding of the customer journey stated that they had trouble unifying different data sources.

On the front end, you may have clickstream data, which can include activity from ad displays, social media, and email campaigns. Some companies even have voice-of-customer data, a real trove for piecing together customer demographic profiles. Companies also have loads of data on the back end waiting to be mined, including margin data, CRM product data, and enterprise resource planning (ERP) records, among others. Combined, front-end and back-end data can turbocharge your data analytics system.

But in order for this to happen, the data needs to be removed from its silo and made accessible in a central behavioral data repository. Everything under one roof and one program to rule them all.

A predictive behavioral analytics platform can take almost any type of data sitting in your data lake and turn it into gold. It does not require you to manually identify every single data point across different departments, because machine learning algorithms do that work for you. An individual data point teaches the model something new, regardless of whether it is tied to other data points in the dataset. As the number of data points flowing into the central behavioral data repository grows, the algorithm’s predictions of user behavior become more and more accurate. Companies that have started activating their data are therefore gaining the edge needed to secure their spot as a market leader for tomorrow. The sooner you “compound”, the greater the benefit.
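The compounding effect shows up even in a toy estimator. This is only a sketch of the statistical intuition, using a running average rather than a real machine-learning model, with a made-up underlying conversion rate: as more behavioral data points land in the repository, the estimate converges on the true behavior.

```python
import random

random.seed(42)
true_conversion_rate = 0.3  # hypothetical underlying user behavior

def estimate(n_points):
    """Estimate the conversion rate from n observed data points."""
    observations = [1 if random.random() < true_conversion_rate else 0
                    for _ in range(n_points)]
    return sum(observations) / n_points

# The estimation error shrinks as the repository grows: the sooner
# data starts compounding, the sooner predictions become reliable.
for n in (10, 1_000, 100_000):
    print(n, round(abs(estimate(n) - true_conversion_rate), 3))
```

Real platforms use far richer models, but the same principle holds: more data points in one place means tighter predictions than any silo could produce on its own.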

Are you already doing something similar? Tweet at me or email me to share your experiences.

 


Are Homemade Predictive Behavioral Analytics Applications Better?

July 17th, 2017 Posted by Analytics, Strategy 0 thoughts on “Are Homemade Predictive Behavioral Analytics Applications Better?”

Homemade apple pie beats the Entenmann’s variety, right? Well, only if there’s a good cook at the helm. The same principle holds true when building analytics applications. With an advanced analytics solution, IT teams can, for example, tailor their websites to a visitor in real time, delivering the right content at the right moment. And your team may be perfectly capable of building the infrastructure and enterprise applications.

But what happens when you have to build for a more specialized function, such as predictive behavioral analytics? More often than not, the skill sets needed for these kinds of applications fall outside the scope of your team. Only at deployment time will you discover just how tall an order this is for your company’s IT developers: they may declare victory after creating a number of scripts, yet still there is no product.

That’s why large organizations – including government agencies – get an advanced behavioral analytics solution to sit on top of their open-source big data stack. It’s that lattice top that every homemade apple pie needs. In this way, organizations have complete control over their data, proprietary models and implementation. There is no confidential or proprietary data that ever needs to be shared with an outside vendor. And with an open-source framework, your data scientists have the freedom to build models on top of it if they so wish.

So, the question is which applications you should build in-house and which you should buy. Enterprise analytics applications are designed for enterprise IT to build in-house, whereas predictive behavioral analytics applications are better acquired, provided it is the right one: one that can sit on your data lake and is not a “black box” of proprietary technology.

Do you have a similar experience? Tweet at me or email me.



Copyright 2018 SYNTASA®. All Rights Reserved.