Archive for category Big Data

The Open Internet- what is it? Why is it important?

Let’s start from the very beginning. The Internet that we know and love today is an open channel that uses free standards for communication. It is an equal-opportunity channel when it comes to traffic/content creation and delivery. The bottleneck is primarily the bandwidth that each individual user has available.

Now let’s say there are two companies providing media-sharing capabilities using the SaaS model. Today, traffic to and from Company A (a mega establishment) and Company B (a fledgling start-up) is treated equally by the broadband service provider; it is illegal for the provider to treat the traffic from the two companies differently (assuming similar and legal content from both).

Now imagine if the broadband company could tell these companies: hey, if you pay me an additional fee, your traffic will get higher priority on my network! It’s easy to see how Company A could edge out competition from a start-up like Company B purely on the strength of its financial advantage. Bigger, better-funded companies could bury the competition simply by buying up bandwidth priority from the Internet providers.

In effect, this can kill technological innovation.

The Openness of the Internet is not a privilege we enjoy- it is and must be a fundamental right. We have a chance now to go and tell the FCC that we want the Internet to remain a free, fair and equal-opportunity channel. The FCC is taking comments from the public on this proceeding (also referred to as preserving Net Neutrality).

The site will remain open for 120 days (starting May 15, 2014), so the clock is ticking!

File your comment here; the proceeding number is 14-28:

http://apps.fcc.gov/ecfs/hotdocket/list

Net Neutrality is important; without it, all the content, access and service we get over the Internet will be dictated by the highest bidder for bandwidth.

Take action. Do it now! #NetNeutrality


The third coming of Natural Language Processing

    “SAP tried to introduce natural language processing based BI tools about five years ago and failed. Why would I use yours?”

Yesterday I was explaining to a customer that the QuickLogix natural language query engine would make it easier for his business users to ask questions and make meaning out of their data. As the IT Director of a multi-billion-dollar company, he was bound to ask this question, and he didn’t disappoint! So why does Gartner project that natural language processing is the next big thing in the world of data analytics and business intelligence? Why now, when it was tried before, not 30 years ago but barely five years ago, and it didn’t really take off then?

It boils down to one major tech Trend in the past 10 or so years and one major Event in the past 3 years.

The TREND
I remember the days when the leading edge of innovation was in the enterprise world and the benefits bled into the consumer market. Sometime around the mid-2000s, perhaps with the introduction of the iPhone, that trend started reversing: consumer products and requirements moved to the leading edge of technological creativity. All things new and exciting in the enterprise world (cloud computing, SaaS products, etc.) are now direct bleed-outs of the consumer market. More mobile devices meant more data being transferred (volume), more content being generated in more formats (variety) and more demand for quick turnaround on data accessibility and processing (velocity). Yes, the familiar 3Vs of Big Data are a direct result of demands in the consumer market.

The EVENT
When it comes to natural language processing, I like to think of the world as pre-Siri and post-Siri. Apple introduced Siri to the world with the iPhone 4S in October 2011. Ever since, there has been a renewed focus among the other phone OS makers on providing (or improving upon) a similar service. Google has long offered its ground-breaking natural language search. However, it is the advent of Siri that has set the average consumer’s expectation that all interactions, personal or otherwise, can (and should) work in plain English.

The Trend and the Event together have subliminally revolutionized the mindset of the workforce. More and more business specialists and users are becoming inclined to use natural language in their work. The mobile evolution will serve as a potent catalyst for the acceptance of NLP by business users in their everyday tasks. The challenges of training them to ask the right questions and make meaning out of the results will remain. But the adoption of the technology itself? It was tried in the 1950s and ’60s, and again in the late 1990s and early 2000s, but in this third coming, natural language processing is here to stay.

5 things you need to know about Data Lakes

Here are five important things to know about data lakes:

1. What is a data lake? That’s a good place to start any conversation! A data lake is essentially a landing zone that stores all the data an organization collects, in its raw form. The main advantage over a traditional enterprise data warehouse (EDW) is that there is no need for upfront extract-transform-load (ETL) processes to ingest the data from operational systems, or to transform it before it can be accessed from the lake; structure is applied when the data is read. In addition, it is relatively inexpensive and massively scalable.

2. Traditional EDW systems also have restrictions on the data types they can support. Every enterprise today collects more data than it processes. The data lake can be used to store data of any type and in any format. As a result, the cost of transforming hitherto inaccessible information (such as text, images and other unstructured data) is eliminated, or at least substantially reduced. What this really means for an organization is that new operational systems can be added to the data lake easily, and users can start deriving insights from them almost immediately (see the sketch after this list).

3. Why isn’t everyone adopting data lakes? There are a couple of pertinent reasons. To begin with, many organizations have invested heavily in the infrastructure, support and services offered by the large EDW solution providers (IBM, SAP, Oracle, Microsoft), and making the transition requires many levels of business justification. Also, data lake technology (and the enterprise Hadoop ecosystem) is new and still evolving. As a result, the early adopters will be organizations that want to stay on the cutting edge of technology, those that want to capitalize on the financial advantages of the data lake, or those willing to place their bets on solutions offered by up-and-coming players like QuickLogix (www.quicklogix.com; full disclosure: I am affiliated with this organization).

4. Data governance has been a challenge with EDW systems, and it is only going to gain more prominence with the advent of data lakes. Gaps in data quality and reliability will be exposed more easily. We should collectively be applauding this development: IT teams can shift their emphasis from building ETL processes that move data into a common store to ensuring that the operational (data collection) systems meet stringent quality standards.

5. Data lakes are not for everyone. A common complaint from data architects and technologists is that their organization is simply not suited to a shift to scale-out, parallel, NoSQL systems. That is often true: to dig a hole, you might just need a spade, not a jackhammer. However, it is important to assess both the current and the future technological requirements of the organization while making these choices.
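To make the “store raw now, apply structure later” idea from points 1 and 2 concrete, here is a minimal sketch in plain Python. The directory layout, file names and field names are hypothetical, and a real data lake would sit on HDFS or an object store with a query engine on top, but the schema-on-read principle is the same.

    # Minimal schema-on-read sketch (hypothetical paths and field names).
    import json
    from pathlib import Path

    LANDING_ZONE = Path("datalake/raw/orders")  # hypothetical landing zone
    LANDING_ZONE.mkdir(parents=True, exist_ok=True)

    # 1. Ingest: operational systems drop records as-is, with no upfront ETL.
    #    Note that the records do not share a fixed schema.
    raw_records = [
        {"order_id": 1, "amount": 250.0, "customer": "Acme Corp"},
        {"order_id": 2, "amount": 99.5, "notes": "rush delivery", "region": "EMEA"},
    ]
    for record in raw_records:
        path = LANDING_ZONE / f"order_{record['order_id']}.json"
        path.write_text(json.dumps(record))

    # 2. Read: structure is applied only when a consumer queries the data
    #    (schema-on-read); fields a record lacks simply come back as None.
    def read_orders(columns):
        """Project the raw JSON files onto the columns a consumer cares about."""
        for path in sorted(LANDING_ZONE.glob("*.json")):
            record = json.loads(path.read_text())
            yield {col: record.get(col) for col in columns}

    for row in read_orders(["order_id", "amount", "region"]):
        print(row)

Note how a new source system with new fields can start landing files here without any pipeline change; only the consumers that care about the new fields need to ask for them.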

Did we screw up customer service in the 2000s?

When I meet with customers to explain that the QuickLogix qLSocial product will help them increase customer engagement and keep their fingers on the pulse of customer sentiment, they sit up and want to know more. Why is this such a big deal for companies now? Because in the past decade or so, businesses have collectively starved consumers of attention to their individual concerns and complaints. What were we doing wrong?

1. Removing customer service phone numbers from websites: Let’s face it- when you are in a bind, you want to tell someone the problem instead of typing out an email or filling out an online form. The one thing you definitely do not want to do is deal with this next thing…

2. Setting up long-winded, automated phone support systems: Possibly the biggest annoyance conjured up by customer service experts in the past decade is the never-ending automated response system. Worse still are the ones that won’t let you skip to a human by pressing 0 right at the beginning. And when you finally do press 0, you wait in line as a “valued customer” for what seems like an eternity. You have to set up a calendar appointment just to find out why there was an unexplained charge on your phone bill!

3. Providing scripts to customer service and sales reps: We have all been on the receiving end of a service representative who appears to simply not understand what we are telling them. I’ll never forget the mortgage company sales rep who called me one afternoon. I was indeed looking to refinance my home, but he would not stop badgering me about what was happening in my personal life that made me want to refinance. I tend to be a private person; I did not appreciate his probing questions, and I said as much. He made me feel like I had a mental health issue because I would not confide in him. It occurred to me later that he was simply following his script: he could not give me numbers unless we walked through the questions he had to ask. It wasn’t entirely his fault, but it’s an example of why profiling customers as a group is just not a smart idea.

4. Placing an unreasonable incentive on short customer service calls: When I worked as a software developer, we once had a manager who decided to measure performance by the number of issue tickets people closed. It led to an incredibly unhealthy and unproductive habit on the team, wherein folks would create tickets for misspelled words in code comments. Incentivizing customer service reps by the shortness of their calls can be similarly counterproductive.

But in all this time, we have been doing some things right. By moving so many business transactions online, creating customer logins, and monitoring web traffic and behavior, we have been setting the stage for recovering lost ground using big data. For the last decade or more, we have been priming consumers to yearn for a personal touch, a feeling of being understood as an individual and not as part of a herd! Engaging with customers is the low-hanging fruit on the big data promise tree. You want to at least start there!

Let’s talk about the CDO- again!

I got roped into another “the time has come” debate about the Chief Data Officer. I maintain that IMHO the CDataO is pure hype. Here’s how the roles of the existing C-suite play out:

CEO: What do we need to do?
CMO: How can we use it or apply it to better our product/image/business?
CIO/CTO: What tech strategy do we need to adopt to get there, and subsequently maintain and scale it?
COO: How can we implement these changes while maintaining normal business operations?

There’s really no room or need for a data officer, because it is close to impossible for one individual to determine all the combinations in which data can be useful to the people in a company. Will creating a new CDO position magically fix data governance, reliability and accessibility issues, let alone make the data more meaningful all of a sudden? No way! It’s all about understanding that these issues exist and need to be addressed and fixed.

The reality, though, is that I get angry and sad to see hype like this gaining momentum; it almost makes the real promise of big data seem illegitimate. I was encouraged when Jim Stikeleather (CIO, Dell/Perot) concurred with this remark:

    “Sharda is spot on. There should be a lead data scientist under the CInfoO, and a lead enterprise architect, and a lead security analyst, etc. The issue IMHO is that CInfraOs are neither strategic nor knowledgeable enough to lead those other roles. The CMO should have the analysts (mathematical and marketing) who are supported by the data scientist. Business units / COO / etc. should have their own domain experts supported by the lead enterprise architect (including enabling the orchestration and choreography among enterprise and SaaS applications). A good CInfoO understands they are an enabler and facilitator, not an owner (or a prohibitor – which is what CInfraOs tend to be).”

It’s trendy to use #BigData @ #CES2014

Recently, I was involved in a discussion on LinkedIn about whether or not Big Data was enabling better customer service. I think the trend at the 2014 International Consumer Electronics Show (CES) indicates that Big Data is actually enabling a better customer experience. If you are an established provider of similar services and products, you will be compelled to sit up, take notice and make changes to keep up with these leaner, smaller competitors that provide service in real time. That is, if you haven’t already started to do so.

According to Forbes (read the article here), the biggest disruptive trends include the use of MEMS (microelectromechanical systems) sensors in everything from household appliances to wearable clothing to self-driving cars. Look past the coolness factor of these products and you cannot miss the enormous force of big data technology driving them.

The data points collected by the sensors can broadly be classified as semi-structured data. They get stored either in a private cloud or in a commercial public cloud store. The product/service provider generally offers some out-of-the-box apps. In addition, they offer authentication and access APIs that users and developers can consume freely to create their own apps (and, most likely, contribute them back to the provider’s ecosystem). This is the evolution of the B2C big data ecosystem. All of this ties back to improved customer service if and only if the product or service provider makes an effort to track consumer behavior, the apps actually created by their customer base, user feedback, customer reviews and competitive trends. Extend this to measure the impact of all these intangible factors on company revenue, profitability and product planning, and you have just made the case for an enterprise big data analytics solution.
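As a concrete aside, here is a minimal sketch in plain Python of what “semi-structured” means in practice. The device names and fields are hypothetical: each reading is self-describing JSON, but different devices report different fields, so a common structure is imposed only when the data is analyzed.

    # Hypothetical sensor readings: self-describing, but with no fixed schema.
    import json

    readings = [
        '{"device": "thermostat-12", "ts": "2014-01-08T09:15:00Z", "temp_f": 68.5}',
        '{"device": "fitness-band-7", "ts": "2014-01-08T09:15:02Z", "steps": 4120, "heart_rate": 72}',
        '{"device": "fridge-3", "ts": "2014-01-08T09:15:05Z", "temp_f": 37.1, "door_open": false}',
    ]

    # Flatten the readings onto a common set of columns for analysis;
    # fields a device does not report simply come back as None.
    parsed = [json.loads(r) for r in readings]
    columns = ["device", "ts", "temp_f", "steps", "heart_rate", "door_open"]
    rows = [{col: rec.get(col) for col in columns} for rec in parsed]

    for row in rows:
        print(row)

In practice the provider’s cloud store and APIs handle this kind of normalization at far larger scale, which is what makes the downstream analytics described above possible.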

See http://www.quicklogix.com to learn how the qLSocial and Genie products can help your enterprise attain the goal of improved customer service and better business insights.

Gimme more data strategy!

I found an article on gigaom.com with an incredibly provocative suggestion: the CDO’s time has arrived! I had to stop and think about that for a bit. My first reaction was “I have to agree; this is a valid point.” Then I read the article in full and found that I had to reverse my position after all. Here’s my response to the author:

“A CIO who understands the need for a proper big data strategy- technology, infrastructure, scalability, governance, accessibility- will do more for an enterprise than a new CxO title. The reality is that data is most powerful when everyone has access to it for their needs. What organizations require is for the data, information and knowledge to be available to the right people as and when they need it- without waiting on data specialists. Such an enabling-focused strategy will benefit any business in the long-term.”
