Several years ago, I first heard Doc Searls make an amusing comment about one of the basic elements of the internet universe, the browser cookie. With full credit to Phil Windley, Doc’s historical summary of ecommerce (and much of the modern internet) went like this:
A brief history of ecommerce can be summarized as this: 1995, the invention of the cookie. The end.
The browser cookie has reigned supreme for nearly two decades. It has given rise to marketing empires like DoubleClick (now part of Google), Omniture, and nearly every imaginable advertising network of the modern web. Cookies also provide context beyond ecommerce, since they help sites fine-tune the user experience and reduce friction for end users.
Cookies have become so pervasive that a contextualized web without them would not be possible. They've also extended well beyond context, as most cookies now actively track internet users, often without explicit permission. With that backdrop, it's hard to imagine that this atomic element of today's web may soon fade away.
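For anyone who hasn't peeked under the hood, the mechanism itself is tiny: one HTTP response header going out, and one request header coming back on every later visit. Here's a minimal sketch using Python's standard http.cookies module (the names and values are purely illustrative):

```python
# A cookie is nothing more than an HTTP header pair. A server responds with
# Set-Cookie, and the browser echoes the value back on every later request
# to that domain. Illustrative sketch using Python's standard library.
from http.cookies import SimpleCookie

# Server side: build the Set-Cookie header that tags this visitor.
cookie = SimpleCookie()
cookie["visitor_id"] = "a1b2c3d4"               # hypothetical tracking token
cookie["visitor_id"]["domain"] = ".example.com" # sent to every page on the domain
cookie["visitor_id"]["path"] = "/"
cookie["visitor_id"]["max-age"] = 60 * 60 * 24 * 365  # persist for a year
print(cookie.output())  # -> Set-Cookie: visitor_id=a1b2c3d4; Domain=.example.com; ...

# Browser side: the raw Cookie header that comes back on the next request.
returned = SimpleCookie("visitor_id=a1b2c3d4")
print(returned["visitor_id"].value)  # -> a1b2c3d4
```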
Perhaps because of how pervasive it is, and how invasive it is to personal privacy, the browser cookie is now under assault on many fronts. The Europeans have taken to legislation as the primary vehicle for acting against personal tracking technologies like cookies, Microsoft has gone so far as to turn on a do-not-track setting by default in the latest version of Internet Explorer, and there are at least a dozen tracking-blocking plugins for Firefox and Chrome. Some ad-tech experts are actually predicting the complete collapse of the browser cookie within five years:
Five years at the most.
At my former company, my peers were the people who created cookies. We didn’t create them for this. It’s a very weak computing mechanism. It’s flawed, invasive, it’s got privacy issues, it’s going to go.
I think it will take five years to kill it. At that point, it’ll be like birds chirping and flowers blooming because we’ll find some kind of value proposition that allows consumers to trust us and opt into personalization. I term it, tailor don’t target.
via - The cookie has five years left says Merkle’s Paul Cimino | Ad Exchanger
It's no surprise that ad-tech professionals see a paradigm shift away from cookies, but that shift isn't being driven by a direct attack on the technology. I can't imagine that the 'average' internet user is proactively installing browser plugins to block cookies, so there has to be another reason why the reach of the cookie is shrinking so quickly. Earlier in the same blog post, Cimino reveals why:
The second main reason is that non-cookieable devices – phones and iPads, Kindles and the like – are generating traffic somewhere between 35% and 40% of our overall traffic. So 35-40% of traffic is not from computers.
Consumer behavior has shifted toward mobile devices, which is forcing a shift away from cookies. Although this might seem like a 'win' for privacy, the ad-tech world has figured out even more invasive ways to target consumers:
I can’t cookie your iPhone or your Android phone. If you are at home or you go to the same place every day, I can see the IP and part of the user agent – enough information to reasonably identify you over and over and keep that good sync between the data – the first- and third-party data and the targeting opportunity that’s out there.
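To appreciate just how little data that technique needs, here's a minimal sketch, in plain Python and purely illustrative rather than any vendor's actual method, of deriving a stable pseudo-identifier from nothing more than an IP address and a user-agent string:

```python
import hashlib

def pseudo_id(ip: str, user_agent: str) -> str:
    """Derive a stable identifier from data the server sees on every
    request -- no cookie required. Illustrative only, not a real vendor's method."""
    # Keep only the coarse parts of the user agent (browser family / OS),
    # which change rarely, so the identifier stays stable across visits.
    coarse_ua = " ".join(user_agent.split()[:3])
    return hashlib.sha256(f"{ip}|{coarse_ua}".encode()).hexdigest()[:16]

# The same person hitting the site from home tends to produce the same hash
# day after day, which is enough to stitch their visits back together.
print(pseudo_id("203.0.113.42",
                "Mozilla/5.0 (iPhone; CPU iPhone OS 6_0 like Mac OS X) AppleWebKit/536.26"))
```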
The takeaway here is that, even as the value of the cookie erodes, the technological fabric of the modern web has produced even more invasive methods for tracking individual behavior. At the same time, the legislation and technology meant to counteract tracking remain focused on the old cookie paradigm. Because these newer tracking systems are still relatively immature, perhaps there is a window of opportunity for consumers to help shape a more balanced framework.
It is this balanced framework that we are focusing on developing at Customer Commons:
Customer Commons holds a vision of the customer as an independent actor who retains autonomous control over his or her personal data, desires and intentions. In this vision, each of us will act as the optimal point of integration and origination for data about us. Customers must be able to share their data and intentions selectively and voluntarily. Individuals must also be able to know exactly what information is being held about them by those who gather it, by whatever means. To achieve this, customers must be able to assert their own terms of engagement, in ways that are both practical and easy to understand for all sides.
I encourage you to join the conversation at Customer Commons. Additionally, I will be devoting more time to writing about how customer engagement in a modern marketplace will be significantly different, and how we can all help to shape that future, and freer, market.
If you are in the Bay Area during the week of May 6th, 2013, please consider joining the Customer Commons Salon that Monday evening.
Filed under: Customer Commons, Customer experience, VRM Tagged: big data, browser cookies, Customer Commons, customer experience, CX, ecommerce, privacy, VRM
There are several interactive charts on that post, all of which reveal some interesting characteristics of how customer interactions vary based on the channel of engagement, by industry and region.
Filed under: Customer experience Tagged: CRM, customer experience, CX, Google, journey mapping
Inference, particularly across large data sets with disparate solution criteria, is one of the tougher challenges for current computing models. Probabilistic computing may unlock an alternative approach to tackling complex problems by enabling systems to infer solutions that lie outside today's linear computational models:
Probabilistic programming languages are in the spotlight. This is due to the announcement of a new DARPA program to support their fundamental research. But what is probabilistic programming? What can we expect from this research? Will this effort pay off? How long will it take?
A probabilistic programming language is a high-level language that makes it easy for a developer to define probability models and then “solve” these models automatically.
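That definition is abstract, so here's a deliberately tiny sketch of the idea in plain Python (not a real probabilistic language, just the shape of one): you write the generative story forward, hand over an observation, and let a generic inference routine, in this case crude rejection sampling, recover the hidden quantity:

```python
import random

def model():
    """Generative story, written forward: pick an unknown conversion rate,
    then simulate 20 visitors with that rate."""
    rate = random.random()  # prior: uniform on [0, 1]
    conversions = sum(random.random() < rate for _ in range(20))
    return rate, conversions

def infer(observed_conversions, draws=200_000):
    """Generic 'solver': run the story many times and keep only the runs
    that reproduce what we actually observed (rejection sampling)."""
    kept = [rate for rate, conv in (model() for _ in range(draws))
            if conv == observed_conversions]
    return sum(kept) / len(kept)

# We saw 13 conversions out of 20 visitors; what rate likely produced that?
print(round(infer(13), 3))  # posterior mean, roughly 0.64
```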
Bonus: The video at the bottom of the linked blog post serves as an excellent overview of where this technology is headed.
Filed under: Research Tagged: DARPA, probabilistic computing
Just under the radar, there's been a lot of activity in the ProjectVRM space of late, with various clusters of work underway around identity research and personal data store development. On the latter, Phil Windley has an excellent post explaining the framework in which personal clouds should operate, using the tried-and-true technologies around the IMAP protocol as a reference point:
In short, email was designed with the architecture of the Internet in mind. Email is decentralized and protocol-mediated. Email is open—not necessarily open-source—but open in that anyone can build clients and servers that speak IMAP and SMTP. As a result, email maximizes freedom and control for the user and minimizes the chance of disruption. The features and benefits that email provides are exactly the same as those we want for personal clouds. Designed right, any application built on a personal cloud would provide similar functionality.
Web 2.0 has given us a model that is exactly the opposite of email. The model encourages user data to be stored in separate silos. You cannot easily migrate from one service provider to another. And when a service provider goes away, you are abandoned and marooned. You are not in control. Of course, it doesn’t help that this is all in the service provider’s best interest. They make money from the fact that the predominant model for building online applications leaves their users powerless.
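Windley's point about openness is easy to demonstrate: any program can be a first-class email client with nothing but a standard library. Here's a minimal sketch using Python's built-in imaplib; the server name and credentials are placeholders:

```python
import imaplib

# Any IMAP server works here -- that interchangeability is the whole point.
HOST, USER, PASSWORD = "imap.example.com", "me@example.com", "app-password"  # placeholders

with imaplib.IMAP4_SSL(HOST) as mailbox:
    mailbox.login(USER, PASSWORD)
    mailbox.select("INBOX", readonly=True)

    # Ask the server for unread messages; the protocol, not the provider,
    # defines what this query means.
    status, data = mailbox.search(None, "UNSEEN")
    unread_ids = data[0].split()
    print(f"{len(unread_ids)} unread messages")

    # Fetch just the headers of the newest one, if any.
    if unread_ids:
        status, msg_data = mailbox.fetch(unread_ids[-1].decode(), "(BODY.PEEK[HEADER])")
        print(msg_data[0][1].decode(errors="replace")[:200])
```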
There’s lots of activity underway in this space. I’ll have my own thoughts in several subsequent posts.
Filed under: Personal Data Stores Tagged: IMAP, personal cloud, personal data, personal data store, VRM
Conducting a census of India is a monumental task. The last such undertaking happened in 2011 [wikipedia]. While the raw data reveal, well, raw statistics, delving deeper into census data is a fascinating exercise. On that note, I recently stumbled upon a new weblog devoted to extracting insights from India's last census. The author, an anonymous reporter based in Delhi, has pulled out some fascinating revelations. One recent post looks at an approach to identifying clusters of wealth by district across India:
We could start with the fact that only around 42,800 people in the country admit to an income of Rs 10 million or more to the income tax department. But almost everyone, the finance minister included, thinks that figure is laughably low. Here I want to talk about a more er…inclusive definition of the privileged.
Take a look at the map below. It maps the proportion of households in each district, who told census-takers that they own all of the following – a TV set, a phone, a computer and a vehicle (scooter/motorcycle or car). That number, for the country as a whole, is 4.6% (roughly 11 million households).
I leave you to draw your own conclusions about what it means to be ‘privileged’ in this country. I also leave you with this question: If the census takers had asked each one of these households, what ‘class’ of society they thought they belonged to, or where they fit in within the income distribution, what do you think their response would have been ( and by ‘their’, I also mean ‘our’)?
Post: We are the 5%
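For anyone who wants to reproduce that kind of cut, the computation behind the map is straightforward. Here's a sketch in Python with pandas, using made-up district names and column names since the post doesn't show the actual census extract:

```python
import pandas as pd

# Hypothetical household-level extract; the real census tables use different names.
households = pd.DataFrame({
    "district":     ["Gurgaon", "Gurgaon", "Patna", "Patna", "Patna"],
    "has_tv":       [1, 1, 1, 0, 1],
    "has_phone":    [1, 1, 1, 1, 0],
    "has_computer": [1, 0, 0, 0, 0],
    "has_vehicle":  [1, 1, 0, 0, 1],
})

# A household counts as 'privileged' only if it reports owning all four assets.
asset_cols = ["has_tv", "has_phone", "has_computer", "has_vehicle"]
households["all_four"] = households[asset_cols].all(axis=1)

# Share of such households per district -- the number the blog's map plots.
share_by_district = households.groupby("district")["all_four"].mean() * 100
print(share_by_district.round(1))
```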
In another set of posts, the author dives into older census data to reveal that per capita income divergence between India and the West (especially the US) peaked at an unexpected time, the close of the '70s:
In 1979, the difference between American and Indian per capita incomes peaked and India began a period of catch-up with not just the US, but with the West in general which continues today. But that year also set in motion another divergence – between India and China which also continues.
Each post on Data Stories provides another useful lens to apply on India’s census data. I know I’ll be following Data Stories to see what else the census data reveal.
Filed under: Big Data, India Tagged: census of india, census takers, raw statistics
An excellent post on big data and the customer experience over at the Harvard Business Review blog. Of note:
Expand the Value You Create for Customers
Filed under: Big Data, Customer experience Tagged: big data, CLV, customer experience, HBR
Having tried nearly every to-do list and task manager over the years, I think I've finally found one that works for me in Trello. I have stuck with it as my go-to application for managing a wide array of both personal activities and collaboration across groups. Many of the current generation of team collaboration and task management tools provide great flexibility, but the learning curve is still too steep to quickly bring a disparate group of tasks and people together. This is where Trello shines; it is a deceptively simple application that provides significant horsepower behind the scenes. Trello uses a skeuomorphic approach to managing activities, relying on the time-tested model of 'boards' that contain columns of movable 'cards'. I've seen people online compare this approach to the Japanese Kanban process used in manufacturing, which I suppose is the inspiration for the application. I can't really do justice to how the application works here, so I suggest you visit their home page and take a tour.
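The underlying model is simple enough to sketch in a few lines: a board holds named lists, each list holds ordered cards, and 'dragging' a card is just removing it from one list and inserting it into another. The toy sketch below is only an illustration of the Kanban idea, not Trello's actual data model:

```python
from collections import OrderedDict

class Board:
    """Toy Kanban board: named columns of ordered cards. Not Trello's real model."""
    def __init__(self, *columns):
        self.columns = OrderedDict((name, []) for name in columns)

    def add(self, column, card):
        self.columns[column].append(card)

    def move(self, card, source, target, position=None):
        # 'Dragging' a card is just a remove from one list and an insert into another.
        self.columns[source].remove(card)
        if position is None:
            self.columns[target].append(card)
        else:
            self.columns[target].insert(position, card)

board = Board("To Do", "Doing", "Done")
board.add("To Do", "Write blog post")
board.add("To Do", "Book flights")
board.move("Write blog post", "To Do", "Doing")
print({name: cards for name, cards in board.columns.items()})
```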
The main browser-based application (there are iPhone and Android companion apps) displays some of the best web coding I've seen. Fog Creek Software, the developer of Trello, is providing enterprise-class horsepower with a consumer-level user experience, which is not an easy feat.
Earlier this year, Joel Spolsky, CEO of Fog Creek Software, wrote that Trello was designed to be used by a wide variety of people:
The biggest difference you’ll notice … is that Trello is a totally horizontal product.
Horizontal means that it can be used by people from all walks of life. Word processors and web browsers are horizontal. The software your dentist uses to torture you with drills is vertical.
Vertical software is much easier to pull off and make money with, and it’s a good choice for your first startup. Here are two key reasons:
- It’s easier to find customers. If you make dentist software, you know which conventions to go to and which magazines to advertise in. All you have to do is find dentists.
- The margins are better. Your users are professionals at work and it makes sense for them to give you money if you can solve their problems.
Making a major horizontal product that’s useful in any walk of life is almost impossible to pull off. You can’t charge very much, because you’re competing with other horizontal products that can amortize their development costs across a huge number of users. It’s high risk, high reward: not suitable for a young bootstrapped startup, but not a bad idea for a second or third product from a mature and stable company like Fog Creek.
via Joel on Software
Fog Creek is aggressively developing the application, and has recently updated the Trello iPhone companion app (I can’t wait to see a native iPad app!). So, a year in with a horizontal product, Trello marks the milestone with a great statistic:
You’ve made 717,337 accounts. We hit 500,000 in July, so it’s going even faster these days.
Congratulations to the Trello team for a successful year!
One of these days I may post about my own workflows using Trello, but in the meantime I encourage you to check out the application for yourself.
Filed under: Uncategorized Tagged: collaboration, enterprise software, Fog Creek Software, Kanban, task management, trello
During a recent conversation I had with Mark Angel [founder of Knova Software and most recently the CTO at Kana], he was quick to point out that the ‘spreadsheet’ stage of cloud computing had yet to arrive. His point was that most of the computational horsepower of the cloud was still largely relegated to the technical elite inside organizations, and end-users had limited options on how cloud data were interpreted. Data, therefore, are frequently interpreted out of context, and far removed from the impacted business process. Nearly a generation ago, spreadsheets altered the corporate landscape by empowering end-users to manipulate data based on their expertise, unleashing an entirely new way of extracting meaningful insights from data. To Mark’s point, for many enterprises, that stage of cloud computing has yet to arrive.
The best opportunity for this ‘spreadsheet’ stage to take hold is in the white-hot field of big data. While capturing and storing data has never been cheaper, the opportunity to extend the ability to interpret this data to vast armies of knowledge workers has been limited. It was in that context that I found this morning’s announcement by GoodData of their Bashes to be an interesting development:
We call our apps “Bashes” — for business mash-ups — because they combine the best elements of consumer apps with modern, enterprise-class technologies. That means consumer apps’ clean and intuitive user interface, ease of use and device independence, with cloud-based business technologies that collect and manage structured and unstructured data from hundreds of sources. With Bashes, businesses can discern meaning from all the data flooding in from emails, social media, enterprise software and cloud apps.
Clearly there's an opportunity to give today's knowledge worker a spreadsheet-like environment to mash up disparate data sets on the fly. It looks like GoodData is positioning its platform, and Bashes, to be one such spreadsheet for this generation's knowledge worker.
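To make the 'spreadsheet stage' idea concrete, here's the kind of ad-hoc mash-up an analyst might hand-roll today, joining a CRM export to support-ticket counts, that ought to become as routine for knowledge workers as a pivot table. All of the data and column names below are invented for illustration:

```python
import pandas as pd

# Two exports a knowledge worker might pull from separate cloud apps (invented data).
crm = pd.DataFrame({
    "account": ["Acme", "Globex", "Initech"],
    "annual_revenue": [120_000, 450_000, 80_000],
})
tickets = pd.DataFrame({
    "account": ["Acme", "Acme", "Initech", "Globex", "Acme"],
    "severity": ["high", "low", "high", "low", "high"],
})

# The 'mash-up': ticket load per account, joined to revenue, sorted by exposure.
ticket_counts = tickets.groupby("account").size().reset_index(name="open_tickets")
mashup = crm.merge(ticket_counts, on="account", how="left").fillna(0)
mashup["tickets_per_100k_revenue"] = mashup["open_tickets"] / (mashup["annual_revenue"] / 100_000)
print(mashup.sort_values("tickets_per_100k_revenue", ascending=False))
```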
Tagged: bashes, big data, Cloud Computing, GoodData, knowledge worker, spreadsheets
David Kay is one of the best knowledge management consultants (if not one of the best consultants, period) I've known. As anyone who knows him will attest, David's one of the few consultants who 'gets it'. Real change in organizations, particularly with knowledge management, is not as much about technology implementation as it is about process transformation. And process transformation rarely happens unless there is an organizational culture that is amenable to change.
Over at his blog, David has an excellent post about warning signs he’s seen in corporate cultures over the last decade of consulting. While he’s focused on the knowledge sharing process, the post could apply to just about any corporate transformation effort. Representing the technology vendor’s point of view in many instances, I found myself nodding with each of his eleven points, especially point number eleven:
11. A lousy work environment, food service, and coffee. Look, we’re not all going to work in the Googleplex with free gourmet lunches and company-branded ice cream treats. But we spend a lot of time at work, and our mental state there matters, and our heads are influenced by our environment. (There’s a reason they spent so much time building cathedrals in the Middle Ages.) If I go to an office building that’s dingy, dreary, sterile, and cut off from natural sunlight, I know something. If the coffee service comes out of 1950s-style glass carafes and hotplates with generic pre-ground beans in foil packets, I know something. If people resign themselves to the depressing burger-and-fries or meatloaf options at the cafeteria, I know something. And if the company hasn’t spent the money for decent computers, double monitors, comfortable ergonomic chairs, and IT that works, I really know something. I know the company doesn’t really care about the employees, no matter what they say, and it’s going to be wickedly hard to get the team excited about taking on a new challenge.
The cafeteria comment made me laugh as I was reminded of a large company cafeteria I visited where there was a five-cent up-charge to use plastic utensils (for EACH utensil). You can only imagine what the morale level was at that now-defunct large company.
Check out David's post; I'm certain it'll remind you of places you've seen as well.
Tagged: Consulting, DBKay, KM, Knowledge Management
Seems a lot like what Gurgaon, the mega-suburb of Delhi, has evolved into:
There's lots of talk about optimizing the customer experience from a process perspective, but not much conversation from a pricing perspective. Pricing, as the article I link to below argues, is about more than building in profitability above product or service costs. Achieving an 'optimal' price requires deeper analysis than most companies actually do. According to the Sloan Review piece, fewer than 5% of Fortune 500 companies have a full-time pricing function, and fewer than 15% of companies do systematic research on pricing. That, to me, was surprising. Here's a clip from the article:
How could companies go about rethinking their pricing strategy? The first area that may require a fundamental rethink is the way companies set prices. Many companies have a significant opportunity to differentiate themselves from competitors by learning how to create, quantify, communicate and capture customer value by implementing customer value-based pricing strategies. A second area concerns price realization — that is, the process of translating list prices into profitable pocket prices. Here, many companies lack the information systems, negotiation capabilities, incentive schemes, controlling tools and sales personnel confidence leading to superior price realization. Small improvements in any of these areas lead to quantifiable results very quickly. (See “Next Steps for Improving Pricing Capabilities.”)
CEO involvement is a critical requirement for ensuring that changes in a company’s pricing strategy lead to a true change in the company’s culture. At the same time, the CEO must ensure that these changes are not seen, as too many failed initiatives are, as “just another project.” CEO championing, bundled with organizational confidence, new capabilities and transformational change are key catalysts to obtain pricing power.
A definite must-read piece.
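As an aside, the 'pocket price' idea in that passage is easiest to see as arithmetic: start from list price and subtract every leak between the price sheet and the cash that actually arrives. The discount categories and figures below are invented, just to illustrate the shape of the waterfall:

```python
# Toy pocket-price waterfall: list price minus every leak on the way to cash.
# All figures are invented for illustration.
list_price = 100.00
leaks = {
    "standard discount":   8.00,
    "negotiated discount": 5.00,
    "volume rebate":       3.50,
    "freight absorbed":    2.00,
    "payment-terms cost":  1.50,
}

pocket_price = list_price - sum(leaks.values())
print(f"list price      {list_price:6.2f}")
for name, amount in leaks.items():
    print(f"- {name:<19}{amount:6.2f}")
print(f"pocket price    {pocket_price:6.2f}  ({pocket_price / list_price:.0%} of list)")
```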
Tagged: customer experience, pricing
It's hard to imagine that a decade has passed, yet it seems just like yesterday. Ten years ago, on this day, I boarded an early morning flight from Pittsburgh to New York's LaGuardia airport. I was beginning my weekly travels across the northeast a day late, delaying the routine Monday departure for Boston to be home for an extra day. Back then, a decade ago, I was working for Siebel Systems, and was overseeing one of the largest software deployments on the east coast. Like most financial services firms, our client had sprawling operations, with headquarters in Boston, but ongoing Siebel deployments dotted the northeast landscape. I usually started the week in Boston, then would work my way through Providence, Hartford, Buffalo, and New York City before heading back home to Pittsburgh. Because I had hastily changed my plans, I couldn't get onto the direct flight to Boston from Pittsburgh, so I decided to fly into LaGuardia and jump onto the first available Shuttle flight from there up to Boston.
Following a well-worn routine, I cruised into the Pittsburgh airport with just 35-40 minutes to departure, knowing that clearing security as a frequent flier was just a formality. I hadn't booked my flight until the last minute, so I wasn't able to get the automatic upgrade to first class, but I did manage to score a bulkhead row seat just behind first class. We departed Pittsburgh on time, and everything seemed routine. As we got up to cruising altitude, the flight attendants went about their morning rituals of handing out drinks, coffee and peanuts throughout the cabin. Nothing seemed out of the ordinary until around the New Jersey border. As many frequent fliers know, when the pilot pulls the throttle back on the engines, it's usually the first sign that the control tower has requested a holding pattern. It was around the New Jersey border that the engines were throttled back, and the guy next to me said, almost instinctively, "Oh no, I'm going to be late for my meeting in midtown." I smiled and replied, "Yup, looks like delays into LaGuardia."
While we were caught up in our own world, we hadn’t noticed that the flight attendants had suddenly disappeared. They hadn’t come back through the cabin, or to pick up trash. Later that day, once on the ground, it finally struck me that the pilot probably had informed them of the horror unfolding in New York, and emotions might have gotten the better of them. A few minutes after throttling back, the pilot came on over the air and said something like, “Well folks, there has been a problem in the New York area and we’re being asked to reroute further over the Atlantic to make our approach into LaGuardia. From what we understand, there is a major fire in lower Manhattan.” That seemed a bit odd, but LaGuardia has flight patterns that cross over large populated areas, so a ‘major fire’ could have meant anything.
As we approached the lower New York area, most of the plane could see smoke billowing out of what looked like one of the World Trade Center Towers. Someone behind me joked, "That's not a copier fire for sure." When we got closer, we could see both towers were smoldering. It didn't make sense. From the air, it was hard to fathom what might have caused both towers to catch fire like that.
After doing a large sweep across the Atlantic, we descended onto LaGuardia’s active runway and quickly parked at the gate. The flight attendants never came through for their routine landing checks. Upon getting to the gate, one of the attendants, with tears in her eyes and visibly shaken, opened the plane door and we quickly shuffled out into a chaotic scene at LaGuardia. I turned on my cell phone and noticed over a dozen missed calls and messages. My first instinct was to call home, but the phone circuits were jammed. While I kept trying to call, an announcement came over the loudspeaker at LaGuardia, “LaGuardia is shut down. LaGuardia is shut down.”
I finally got through to my wife, and let her know I was alright. The family had been in shock, watching the scenes from New York, and knowing that I was on a plane bound for LaGuardia. The phone line got cut off mid-way through the call, so I decided to run downstairs to the taxi stand and see if I could hail a taxi to, frankly, anywhere. LaGuardia didn't seem to be the place to be at that moment. As I ran down the steps, I saw one empty cab, and waved at the cabbie to come over. He waved back and yelled, "I'm done today, going home." "Which way are you headed?" I asked. He said, "Jamaica," as in Jamaica, Queens. I yelled back, "Can you drop me off in Forest Hills?" He reluctantly nodded and I ran over and jumped in. I blurted out Forest Hills almost instinctively, as I knew it was between LaGuardia and Jamaica, and my cousin lived there.
Inside the cab, the cabbie was crying. The radio was on and, of all the stations to have dialed in, he had Howard Stern running. Stern was mumbling something about two airliners having hit the World Trade Center, a third having hit the Pentagon, and something about the Sears Tower in Chicago. To Stern's credit, soon after, he essentially turned his broadcast over to ABC News. The cabbie and I listened in shock to Peter Jennings describe what was happening across the country.
The cabbie turned around to me, as we got onto the Brooklyn-Queens Expressway, and asked where to in Forest Hills. I gave him my cousin’s address, and when we got there he refused to take the cab fare. He just said, “Be well, man” and took off. He had dropped me off at the intersection of Queens Boulevard and Yellowstone. As I made my way over to my cousin’s place, I heard a fighter jet scream across the sky above. That’s when it really struck me that this was going to be a day never to forget.
I got up to my cousin's place, and out her south-facing window we could see the towers smoldering. The intensity of the smoke had reached a point where the tops of the buildings were no longer visible. She had the TV on, and we kept switching between channels to learn as much as possible as the morning wore on. We were having problems with both landline phones and cell phones, but the internet connection was working. I logged into my email account and fired off emails to my relatives and friends. I then logged into my Siebel email address, and it was flooded with messages from across the company.
Working in the Financial Services practice at Siebel, I knew that many of my friends and colleagues were in lower Manhattan on assignments. As fate would have it, many Siebel employees were in and around the towers that day, but all of them were able to escape in time. Other friends of mine would not be so fortunate.
Prior to working at Siebel, I had been a part of the Oracle Financial Services practice in New York. In those days I lived in the New York area, in fact just across the river from the World Trade Center, in the Newport area of Jersey City. The Oracle financial practice had been booming across North America when I joined, but I had been promptly told by the practice lead that there would likely be no travel beyond Manhattan, since the biggest projects, and biggest financial institutions, were all a subway ride away. Coming on board at roughly the same time was Ken Zelman, a native of New Jersey and one of the hardest-working guys I ever met. Ken and I worked on a few small accounts together, but really established ourselves working on a massive project at Merrill Lynch. We spent the better part of a year at Merrill's World Financial Center building across from the WTC, and at 100 Church Street, which was also next to the towers. It was a year in, and around, the towers. Through the long hours there, we got to know the street food, restaurants and rhythms of the towers quite well. I eventually left for Siebel, but Ken continued to flourish at Oracle. We stayed in touch after I left, and I even tried to coax him to come join me at Siebel. He was doing well at Oracle, and more importantly, had a job that kept him at home. Once the Merrill project completed, he naturally took an opportunity at Marsh, inside the towers.
A few months before that day, I called Ken, just to catch up on things. He was excited to hear about the growth at Siebel, but less interested in the amount of traveling I was doing. He said that the Marsh ‘gig’ was great because the NJ Transit Bus from Central New Jersey stopped right in front of the Towers, which made his commute a breeze. We even joked that I would have probably been at Marsh with him had I stayed at Oracle, given that I was just one PATH stop away from the towers.
As that Tuesday wore on, we saw the first tower collapse from my cousin’s Queens apartment. At first we didn’t know what had happened. Then, CNN kept reporting of a possible collapse. We then saw the second tower collapse. It was then that I thought of Ken.
We sat around at my cousin's place, getting reports from relatives across the country. Everyone was safe. As night began to fall, I decided to call Ken's phone. Ken's wife picked up. Initially, I was relieved, figuring Ken was home and all was well. She then said she hadn't heard from Ken all day. I didn't know what to say. I kept thinking that he must be ok, just stuck somewhere. As it would turn out, he, along with another colleague, Frank Deming, never made it out of the towers. I told Ken's wife that I'd call back the next morning to see if all was ok, but I never could muster the courage for that call.
The next day, I made my way to Manhattan. I must have walked 200-300 blocks of the magnificent city, just observing the terrible silence that had descended upon it. Nearly every corner of the residential part of the east side had pictures of missing people up. I, along with nearly everyone else, felt a duty to look at each picture, almost out of respect for the missing. That scene would repeat itself from the east side to the west side, and as I made my way down toward the Village. Through the cavernous views down Manhattan's avenues, the smoke was visible looking southward. Having lived in New York for many years, I always knew that New Yorkers were some of the toughest people on earth. On this day, they displayed their humanity. It was a stunning sight.
It would be several more days before I made it back home, but it was hard to look toward lower Manhattan for the longest time. A couple of months later, as travel picked up again, I returned to New York for an assignment. This time it was at the American Stock Exchange. Just blocks from ‘Ground Zero’. After making it through layers of security toward the exchange, we were ushered up to a higher floor to begin our review. The conference room that we used for the next several weeks had a direct view into the ‘pit’. Those days at the exchange were some of the most difficult working days in my professional career. I’ve continued to return to New York for work since that day, often to lower Manhattan. It’s never easy to go back to that place, but life does move on.
On that day, I would later learn, a close friend from graduate school was working at the Pentagon when he was blown out of his chair by the impact of the third plane. He managed to get out of the Pentagon just fine. In October, my friends from graduate school all gathered in Washington, DC to honor those lost at the Pentagon.
A lot has happened since this day a decade ago, and arguably the country has changed. But through all the noise and posturing that would follow in the ensuing decade, the memory of those lost on that day hasn't faded. The passengers on the ill-fated flights, the firemen, New York's Bravest, the policemen, New York's Finest, the Port Authority police, the workers and visitors in the towers, those who lost their lives at the Pentagon, and the heroes of United Flight 93, who began the defense of the homeland while crossing Pennsylvania. This day should be about honoring them. For me, it's the day that, a decade ago, I lost my friends Ken Zelman and Frank Deming. Much has changed in a decade, but they are not forgotten.
Tagged: 911, New York City, Pentagon, Shanksville, World Trade Center
Earlier today, Oracle announced an agreement to acquire knowledge management vendor InQuira. Given InQuira's deep integration with legacy Oracle products, and despite partnerships with SAP and Genesys, it was just a matter of time before Oracle absorbed InQuira. R. Ray Wang explains why Oracle finally pulled the trigger:
InQuira “is one of the top knowledge management vendors in the business,” said analyst Ray Wang, CEO of Constellation Research. “They’ve been positioning for a sale to Oracle or SAP for the past 24 months.”
While knowledge management is “a critical component” of CRM systems, most have “a big gap in this area,” Wang added.
It might seem that a vendor such as Oracle, which already had content management and enterprise search capabilities, could build out its own knowledge management system. But the fact is that knowledge management is “a specialized niche,” not only in terms of technology but the customer base, Wang said.
The last point that Ray makes is key: knowledge management is a specialized niche, a niche that ultimately wasn't big enough to sustain standalone vendors. To put it another way, enterprise KM turned out to be a small pond.
Having spent many years in that pond, I can tell you it was a tough place to swim. While we all might have thought we were creating a blue ocean for ourselves, market realities and shortsighted sales strategies ended up creating a red ocean. Knowledge management in this red ocean required access to channels, or flows of knowledge. That meant ultimate dependency on the owners of those channels – the CRM vendors. So it comes as no surprise that the last fish in that small pond gets gobbled up by the largest CRM vendor.
The era of bulky, on-premise enterprise KM is over, but that doesn’t mean a blue ocean doesn’t exist for KM.
Forrester's Kate Leggett offers some good insight on the announcement as well.
Tagged: Customer relationship management, Inquira, Knowledge, Knowledge Management
We all know Wolfram for their Mathematica and Wolfram Alpha products, so the arrival of the Computable Document Format (CDF) shouldn't come as much of a surprise:
The idea is to provide a knowledge container that's as easy to author as documents, but with the interactivity of apps—for CDFs to make live interactivity as everyday a way to communicate as spreadsheets made charts.

For too long, authors have had to aggressively compress their ideas to fit down the narrow communication pipe of static documents, only for readers at the other end to try to uncompress, reconstruct, and guess at the original landscape of information. Static documents are like a very lossy format, fuzzifying clear and fuzzy thinking alike, disguising problems, and often resulting in overwhelming communication failures: undeployed R&D, misunderstood risks, and wrong management decisions, not to mention limiting the flow of information intrinsic to education.

Static documents take their share of the blame in making us "information rich, but understanding poor", to repurpose the common saying.

With CDFs we're broadening this communication pipe with computation-powered interactivity, expanding the document medium's richness a good deal. Actually we're also improving what I call the "density of information" too: the ability to pack understandable information into a small space—particularly important on small screen devices like smartphones.
via Wolfram Blog: Launching the Computable Document Format (CDF): Don't Compress the Idea, Expand the Medium.
What’s interesting here is the attempt to add learning and knowledge traits to an actual knowledge container. One initial downside of this new format is that it requires yet another browser plug-in (and a large one at that). Anyway, this is certainly worth keeping an eye on.
- Wolfram Launches Computational Document Format (news.slashdot.org)
- New file format allows journalists to create interactive infographics (blogs.journalism.co.uk)
- Wolfram starts up CDF format for ‘live’ documents (electronista.com)
Filed under: knowledge, learning Tagged: Conrad Wolfram, File format, Interactivity, Mathematica, Plug-in (computing), Wolfram Alpha, Wolfram Research
Interesting notes related to a recently published book that is probably worth adding to my list of books to read:
The only difference between the current fad for ludology — the study of games — and any other time in the history of the internet is that now that marketers and entrepreneurs know that humans have a weakness for game mechanics, and that we can be trained to do almost anything as long as it involves a reward, however ephemeral, they are actively pursuing gamification as the latest and most sophisticated strategy for selling us stuff and/or capturing our free time.
- As websites become games, understand the trend with the Gamification Encyclopedia (thenextweb.com)
- Platform adds gaming elements to any website or application (springwise.com)
- Game Dynamics of Learning: The Gamification of Training and Performance Improvement (compassioninpolitics.wordpress.com)
Filed under: Quotes Tagged: Game design, Game mechanics, Game studies, Gamification, MIT
For those of you who follow the social media landscape as it relates to businesses, please join an interesting bunch of folks at today’s Global Social CRM meet-up. While some of us will take advantage of being at a Cisco Telepresence site, you can join via Justin.tv or via WebEx. Details of the meet-up, and how to attend remotely, are at the link below:
The meeting will start with an introduction video by Paul Greenberg, followed by master interviews with Natalie Petohouff, Ray Wang, Mitch Lieberman, Brian Solis and Frank Eliason.
They will cover the following topics:
- How SCRM plays in the world of enterprise applications;
- Justifying Communities in SCRM planning;
- How does SCRM work for SMB?
- How communities grow from Twitter.
This will be followed by panel discussions with LaSandra Brill [Cisco], Katy Keim [Lithium], Munish Ghandi [HyLy], Peter Grambs [Cognizant], and Kira Wampler [Ant's Eye View], moderated by Esteban Kolsky.
Topics that will be discussed are:
- Using online communities for SCRM;
- How are communities being used today?
- How has the advent of the Social Customer changed communities?
- How are businesses reacting to these new social-networks-as-communities?
- Is there a business justification for using communities?
- What can we expect going forward?
- How can a business benefit from using communities?
- Case studies.
I’m looking forward to this lively discussion.
Filed under: Business, Collaboration, Cust. Service and Support, Enterprise 2.0, Social CRM Tagged: Enterprise 2.0, meetup, scrm, social crm
It's hard for me to imagine that, despite all the writing (and tweeting) that I've done over the past year, I haven't updated this blog in nearly a year! I guess there are many reasons, but instead of dwelling on them, I figured the time was right to get back into the swing of things. First, a quick housekeeping notice: I've moved off of my own instance of WordPress and onto WordPress.com. There may be some broken links or missing pages, but I'll try to get those back up as soon as possible.
The last time I posted here I linked to a great photo collage on India, so why not kickstart this weblog with another inspirational link about the subcontinent?
The video below is an animation that Arjun Rihan created as his graduate school thesis at the University of Southern California. I first met Arjun several years ago, as he was contemplating a career change (he was at Oracle at the time, if I remember correctly). He followed his passion, and as the clip below demonstrates, became a creator and a first-rate storyteller. Arjun recently put Topi up on YouTube, and you can read about the process behind the final product on his weblog. He now works at Pixar:
Filed under: Administrative, India Tagged: Administrative, Animation, India