Polygraph Media Announces Launch of its Big Data Analysis Tool for Facebook Marketing

Polygraph Media is launching its Facebook data mining reports product today. It’s great to see company founder Chris Treadaway bring this product to market after watching many of the early versions over the past several months. With all the hoopla surrounding the Facebook IPO, it’s a good time to remember that Facebook is really the platform where brands engage with their constituents, which makes analysis of your brand’s engagement on Facebook essential. Taking a data science approach, Polygraph works in real time to collect data from Facebook Pages and subscriber-enabled Profiles, and quantifies it in over 40 charts, graphs, and data visualizations that give marketers actionable intelligence to inform marketing campaigns and strategies. As a “truth detector” for Facebook marketing, Polygraph Reports are ideal for brands with a major Facebook presence and the agencies or consultants who work with them.

To get a sense of the platform’s power, we asked for an example report on the top luxury resorts in Las Vegas. You can see that report here.

“Facebook has grown so fast that companies are now significantly increasing their marketing spend on Facebook Pages and Advertising,” said Polygraph founder Chris Treadaway, author of the book “Facebook Marketing: An Hour A Day.” “But increasingly, we are hearing that marketers need to justify the increased attention and budget. Is it all worthwhile? If we spend more money or human resources on Facebook, what will we get out of it? The answers lie in the data – every social interaction that takes place on a Page is a discrete and important data point that can be mined for information. We created Polygraph to serve this need – an analytics solution that can look at your Pages and those in your competitive environment, and can clearly survey the Facebook battlefield for marketers, executives and agencies.”

Polygraph brings the power of data science (mining, analysis and reporting) to anyone who wants to analyze business-to-consumer activity on Facebook. Polygraph Reports cover all major Facebook marketing scenarios, including community management, content marketing, top content and competitive benchmarking. Additionally, Polygraph can identify the Top 1,000 Fans of any Page on Facebook – essentially a “Klout” for Facebook Pages. The result is a concise report that not only shows marketers the “cause and effect” of Facebook marketing strategies and tactics, but also gives innovative agencies the opportunity to create custom campaigns based on the data.
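For the curious, the raw material here is Facebook’s public Graph API. Below is a minimal sketch of the kind of Page-level collection a tool like Polygraph automates at scale – this is illustrative, not Polygraph’s actual code, and the Page ID and access token are placeholders.

```python
# Illustrative sketch of mining a Facebook Page via the public Graph API.
# Not Polygraph's code. PAGE_ID and ACCESS_TOKEN are placeholders.
import requests

PAGE_ID = "examplepage"      # hypothetical Page ID
ACCESS_TOKEN = "YOUR_TOKEN"  # obtained via a Facebook developer app

def fetch_page_posts(page_id, token, limit=100):
    """Pull recent posts from a Facebook Page."""
    url = f"https://graph.facebook.com/{page_id}/posts"
    resp = requests.get(url, params={"access_token": token, "limit": limit})
    resp.raise_for_status()
    return resp.json().get("data", [])

def engagement_summary(posts):
    """Reduce raw posts to simple per-post engagement counts --
    the kind of data point a report would chart and benchmark."""
    return [{
        "id": post.get("id"),
        "likes": len(post.get("likes", {}).get("data", [])),
        "comments": len(post.get("comments", {}).get("data", [])),
    } for post in posts]

if __name__ == "__main__":
    for row in engagement_summary(fetch_page_posts(PAGE_ID, ACCESS_TOKEN)):
        print(row)
```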

“We’ve all heard how if Facebook were a nation, it would be the third largest country on Earth. So it’s common sense that brands have focused so much attention here,” says Brad B McCormick, principal at 10 Louder Strategies and a former senior digital leader at the global agencies Ruder Finn, Porter Novelli and Cohn & Wolfe. “But just because a company can be on Facebook doesn’t mean they know how to be on Facebook and measure success. Acquiring Facebook ‘likes’ is just the first step for brands. Ongoing engagement that builds brand equity is the holy grail of Facebook. But all too often today, brands and agencies are measuring success with empty platitudes and without data or relevant benchmarks,” McCormick continued. “Polygraph Media offers by far the most robust Facebook analytics I have ever seen. It not only gives brands new insights into their own performance, it allows them to compare each of their KPIs against their competitors, in real time. It’s a game changer.”

Polygraph is available to any interested users, but the largest pockets of interest to date, and strongest beta testing feedback, have come from three different groups:

  • Agencies like those McCormick has worked with – whether public relations, advertising, marketing or newer social agencies – that support clients in their social marketing endeavors and are often challenged to buy, or build, tools that can pull actual data and actionable analytics.
  • Social consultants and independents who are experts in social media, with the same sort of clients and needs.
  • And brands themselves, who are reaching the end of the road on “just do social” and moving to the next phase, in which they must determine real ROI on social initiatives and strategically manage their social programs.

These brands – or their agencies and consultants – use data-science-based Polygraph insights to achieve numerous strategic objectives, including:

  • Deepening relationships with customers – identifying most influential customers; understanding likes and dislikes, patterns, commonalities; activating these influentials with relevant offers, promotions, messages.
  • Delivering relevant, timely and impactful content – understanding what interactions are taking place on a Page; seeing what content is most engaging; knowing what times of day and days of the week the audience interacts with the brand most frequently.
  • Monitoring and benchmarking against the competition – a real-time lens on what the competition is doing and when; insights into competitive positioning; a clear comparison of the brand’s metrics against its competitors’.

A SaaS solution built entirely on Microsoft’s cloud services, the Windows Azure cloud platform, Polygraph is one of the first major big data applications built on Azure. “We have been working with Polygraph on the development of its social data mining tools and are very pleased with how quickly Polygraph Media embraced and adapted to the Windows Azure platform,” says Rodney Sloan, Principal Platform Strategy Advisor at Microsoft. “Equally exciting for Microsoft and its customers is to have an innovative company like Polygraph Media become a Windows Azure Platform partner. Polygraph is putting the value of Azure into action.”

While Polygraph will offer data mining tools – and analysis – for other social platforms (Twitter, LinkedIn, YouTube) later in 2012, the company is deliberately starting with Facebook, where critical information has, to date, been inaccessible, locked away in individual comments and sentiments. In beta since November, the Polygraph Facebook marketing truth detector is already being used by over 25 brands, major agencies and consultants. Customers can buy one-time Polygraph Reports or subscriptions for automatic updates and ongoing access. Pricing is based on the size of the Facebook Pages being analyzed.


Gazzang Selected for MIT Sloan CIO Symposium’s Innovation Showcase

The 9th Annual MIT Sloan CIO Symposium today announced Gazzang as one of ten vendors chosen for the 2012 Innovation Showcase. In selecting Gazzang for the prestigious event, judges noted that the company’s cloud-based Encryption Platform for Big Data represents a cutting-edge B2B solution that combines strong value with innovation and has the potential to impact both the top and bottom lines.

“Despite all the hype, big data is still in the early stages and is a market ripe for innovation. As the volume of data continues to grow, organizations are now turning their heads towards big data security and looking for solutions that can protect the massive amounts of sensitive information being generated, while maintaining a high level of availability,” said Larry Warnock, president and CEO of Gazzang. “We are honored to be selected as an MIT Sloan CIO Symposium Innovation Showcase finalist and look forward to sharing our strategy and solutions with industry thought leaders at the event.”

The Gazzang Encryption Platform for Big Data works as a last line of defense for protecting data within Hadoop, Cassandra and MongoDB – non-relational, distributed and horizontally scalable data stores that have become common management tools for big data initiatives. These data stores are known for their ability to process petabytes of data in real time, but they lack many of the built-in security controls associated with traditional database solutions. The Encryption Platform transparently encrypts and secures data “on the fly,” whether in the cloud or on premises, and includes advanced key management that meets compliance regulations and allows users to store their cryptographic keys separate from the encrypted data.
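Gazzang’s platform is proprietary, so here is only a minimal sketch of the principle the paragraph describes – encrypt before the data store ever sees plaintext, and keep the key somewhere else. The key-server URL and collection names below are hypothetical, and this is not Gazzang’s implementation.

```python
# Illustration of transparent encryption with separate key management.
# Not Gazzang's implementation; the key server and names are hypothetical.
import requests
from cryptography.fernet import Fernet
from pymongo import MongoClient

KEY_SERVER = "https://keys.example.com/v1/datakey"  # hypothetical key service

def get_key() -> bytes:
    """Fetch the encryption key from a separate server, so a compromise
    of the data store alone exposes only ciphertext."""
    resp = requests.get(KEY_SERVER)
    resp.raise_for_status()
    return resp.content  # assumed to be a base64-encoded 32-byte Fernet key

def store_encrypted(collection, doc_id, plaintext: bytes):
    """Encrypt 'on the fly' before the record reaches the data store."""
    collection.insert_one({"_id": doc_id,
                           "blob": Fernet(get_key()).encrypt(plaintext)})

def load_decrypted(collection, doc_id) -> bytes:
    doc = collection.find_one({"_id": doc_id})
    return Fernet(get_key()).decrypt(doc["blob"])

if __name__ == "__main__":
    coll = MongoClient()["bigdata"]["records"]  # local MongoDB, demo names
    store_encrypted(coll, "user-42", b"sensitive payload")
    print(load_decrypted(coll, "user-42"))
```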

“We are very impressed with these top ten Innovation Showcase finalists, as their technologies demonstrate incredible state-of-the-art thinking applied toward today’s and tomorrow’s challenges,” said David L. Verrill, Executive Director of the MIT Center for Digital Business and Co-chair of the Innovation Showcase. “The Innovation Showcase provides a terrific one-of-a-kind opportunity for these startups to gain greater visibility in front of IT executives, key stakeholders, and venture capitalists.”


Infochimps Adds Dashpot To New Platform

Infochimps, the first open data marketplace and a leading provider of tools, content, and operational expertise for big data infrastructure, today announced Dashpot for the Infochimps Platform, an easy-to-use analytics and operations dashboard that provides customers with a powerful suite of tools for business metrics and data visualization, cluster management and system monitoring.

Dashpot enables customers to easily capture and visualize data on the fly as it is ingested by the Infochimps Platform, through an in-stream decoration process powered by Infochimps’ Flume-based Data Delivery Service. The process helps users get insights in near real time and ultimately multiply the value of their data. Once data is fully ingested by the Infochimps Platform, Dashpot also gives customers full visibility into and control of their Big Data stack, all in one place, helping them go from input to insight faster.

“We built Dashpot to dramatically increase the usability and management of the Infochimps Platform and help our customers see their data as it streams in. When users decorate their data with other sources, they can produce completely new insights that can fuel a new level of critical decision making,” said Joe Kelly, co-founder and CEO. “For example, a customer in the retail vertical is using the Infochimps Platform to capture real-time point-of-sale data. Sixty seconds after a customer runs a credit card in a store, or gives their email address or phone number, their demographics have been fetched and can be filtered and viewed in Dashpot. Being able to display real-time transactions with customer profiles, seconds after they occur, stands to change the game for the retailer.”
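To make the “decoration” idea concrete, here’s a toy sketch of the pattern Kelly describes: as each raw event streams in, it is enriched with a lookup against another source before it lands anywhere. The data and lookup table are made up; this shows the concept, not Infochimps’ Flume-based service.

```python
# Toy "in-stream decoration": enrich each event as it flows through the
# pipeline. Made-up data; not Infochimps' Flume-based Data Delivery Service.

# Stand-in for an external source, e.g. a customer demographics lookup.
DEMOGRAPHICS = {
    "user-1": {"age_band": "25-34", "zip": "78701"},
    "user-2": {"age_band": "35-44", "zip": "78704"},
}

def decorate(events):
    """Lazily attach demographic fields to each raw event as it streams by."""
    for event in events:
        enriched = dict(event)
        enriched.update(DEMOGRAPHICS.get(event["user_id"], {}))
        yield enriched

raw_stream = iter([
    {"user_id": "user-1", "sale": 19.99},   # e.g. a point-of-sale event
    {"user_id": "user-2", "sale": 4.50},
])

for record in decorate(raw_stream):
    print(record)  # decorated records, ready to filter and visualize
```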


Big Data Has Entered The Building

Unless you’ve been under a big rock, you’ve heard the chatter about big data. The term describes the tools and practices associated with unlocking the value in large amounts of structured and unstructured data. If that sounds confusing, just think of how much information is being created daily. Now think of what it takes to find the nuggets that matter. Big data means big business.

According to analyst firm IDC, the big data market will grow from $3.2 billion in 2010 to $16.9 billion in 2015, growing 7 times faster than the information and communications technology (ICT) market.

To get a sense of how fast the space is moving, MIT researcher and author Andrew McAfee points to a project from Allstate that was submitted to Kaggle, which runs data prediction contests.

More than 200 people on 107 teams submitted 1,290 entries to try to win the $10,000 prize. What’s more striking is that the winning entry was 340% more accurate than Allstate’s existing methodology for predicting claims based on vehicle characteristics. Oh, and it took only 90 days.
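If you want a feel for the task the contestants were tackling, here’s a toy scikit-learn sketch of predicting claim cost from vehicle characteristics. The data is entirely synthetic and the winning entries were far more sophisticated; this just shows the shape of the problem.

```python
# Toy version of the Allstate/Kaggle task: predict claim cost from vehicle
# characteristics. Synthetic data only -- purely illustrative.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 1000
X = rng.uniform(0, 1, size=(n, 3))  # pretend: vehicle age, engine size, weight
y = 100 * X[:, 0] + 50 * X[:, 1] ** 2 + rng.normal(0, 5, n)  # synthetic cost

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = GradientBoostingRegressor().fit(X_tr, y_tr)
print("R^2 on held-out data:", round(model.score(X_te, y_te), 3))
```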

McAfee says that the prediction models, tools and techniques have changed so rapidly over the last decade that large companies are simply outgunned. Predictably, that’s forcing those same organizations to rethink their core competencies.


“If a bunch of kids and outsiders, who don’t know our customers, markets, and histories at all, can make much better predictions than we can in critically important areas, what does this imply? Do we need to completely rethink how we make our predictions? Do we need to outsource or crowdsource the making of predictions, even if we currently think of it as a ‘core competence’?”

So what other areas are ripe for the reach of big data? Booming industries like oil and gas are obvious candidates. There’s tons of activity, an ecosystem of companies providing complementary services, and of course cash on hand to hire the data scientists and number crunchers. Austin’s own Drilling Info is a case in point. It recently closed a big round of funding, one that shows that when industry processes can be transformed, the money will likely follow.

“If you go back 10 years, oil and gas companies would hire consultants to go to repositories of information at the state level, and manually collect data, put it into a spreadsheet and analyze it,” says Deven Parekh, a managing director with Insight. “What we’re really doing now is making that data available through regular workflows on a daily basis.”

Insight’s Parekh also mentions the importance of domain expertise, something often overlooked as organizations dive deeper into analytics.

“In three or four years, the technology itself may become fairly standard, but the key will be in having the deep vertical knowledge.” Of course, as we saw with Allstate, data-driven insights might actually alter what some companies see as their real expertise.

Lastly, McKinsey published a report last fall on the big data movement. One of its notable excerpts was a chart offering an industry-specific view of big data projects, as well as which segments stand to gain the most.

Their report also identified one of the biggest challenges lurking in large organizations – the dreaded silo.

“One big challenge is the fact that the mountains of data many companies are amassing often lurk in departmental ‘silos,’ such as R&D, engineering, manufacturing, or service operations – impeding timely exploitation.”

But surely we can unlock all this data with all the technology we have, right? Sure we can. But just in case, let’s hear it for big data’s next hero, the system integrator.


Infochimps Launches Enterprise Platform

With the launch of the Infochimps Enterprise Platform, the company appears to be making available to the public the very same system it has been running its data marketplace on. If you’ve ever tried to manage your own Hadoop cluster and used a variety of tools with crazy names like Chef and Pig, then you understand the complexities of managing big data. Infochimps has even added a few of its own crazy names into the mix with Wukong and Ironfan.
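If you’ve never touched that toolchain, the usual entry point is Hadoop Streaming, where any executable can act as the mapper or reducer. Here’s the canonical word count in Python – shown only to give a taste of the raw plumbing that tools like Pig and Wukong exist to abstract away, not anything Infochimps ships.

```python
#!/usr/bin/env python
# Canonical Hadoop Streaming word count. Test locally with:
#   cat input.txt | python wc.py map | sort | python wc.py reduce
# (Hadoop performs the sort between the map and reduce phases.)
import sys

def mapper():
    for line in sys.stdin:
        for word in line.split():
            print(f"{word}\t1")

def reducer():
    current, count = None, 0
    for line in sys.stdin:
        word, n = line.rsplit("\t", 1)
        if word != current:
            if current is not None:
                print(f"{current}\t{count}")
            current, count = word, 0
        count += int(n)
    if current is not None:
        print(f"{current}\t{count}")

if __name__ == "__main__":
    mapper() if sys.argv[1] == "map" else reducer()
```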

The most important (and unaddressed) part of this news is the business model pivot from marketplace to enterprise toolset. It’s not unexpected when the financial backing is coming from a venture firm, and Infochimps is hardly the first company to pivot into the enterprise, but it does change many things about a company. Primarily, you have to ramp up an enterprise sales organization, which is a significant culture change. Luckily, Austin is deep in enterprise software talent courtesy of a fistful of successful companies such as Trilogy, Tivoli, Vignette, Motive, and more.

Best of luck to our favorite data monkeys. Go crush it!


Loku Launches Public Beta Today

For those of you who attended last week’s Beta Summit at Innotech, you’ve already seen the preview of Loku.com. Today, it officially launches in public beta as “the site that reveals what locals care about.” With Loku’s unique “Big Data for Local” contextual search technology, people can find the hidden hot spots, catch the local buzz, discover cool new things to see and do, and avoid the tourist traps—it’s a fun and exciting way to connect with your local community.

In an age when the World Wide Web has disconnected people more than ever, Loku.com is the answer for re-connecting with our surroundings. Loku.com helps folks re-discover their neighbors, find new places and activities, and reclaim the enjoyment and meaning that comes from embracing our neighborhoods, boroughs, towns and suburbs.

“Our local communities are broken. People are not involved. And yet there is a huge interest—even hunger—for local connections. Just look at the move toward locally-grown food and local crafts,” said Dan Street, Chief Executive Officer of Loku.com. “This kind of change helps everyone, even visitors, live a unique life specific to a geographic area. It’s an exciting time. Technology has the capacity to really improve our lives, and we’re using it to improve our relationship with where we live.”

Street, who moved often as a child and never experienced hometown roots, came up with the idea for Loku.com as he traveled as an executive for firms like Bain & Company, Primedia and Kohlberg Kravis Roberts. “I often wondered how people develop a sense of place, particularly if you weren’t born there,” he recalled. “We wanted to create a way for people to see the world through the eyes of a really connected local.”

Doing so, however, was a mammoth process that took 3 years and more than $350,000 of Street’s own money along with another $1 million in angel capital. The result: Big Data for Local, a proprietary technology that analyzes online information, instead of simply assembling or ranking it.

“When you’re looking for local info, most sites give you specific Web pages. We return a guide – a synthesis of what matters,” Street explained. “We’ve built the analytical layer on top of the local Internet.

“For example, we can tell you the aggregated sentiment – the ‘buzz’ – about a local restaurant or event. We can pull all the information together and present insights in a lively, single-page, graphical format. ‘What are the inside tips that only locals know?’ ‘Which place is generating all the interest?’ It varies by community – in the Mission District of San Francisco, for example, restaurants and safety are more important than in Katy, Texas, outside Houston, where schools and kids get people talking. That’s the kind of locally focused online experience we can offer.”
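Loku’s Big Data for Local pipeline is proprietary, but the aggregated-sentiment idea is simple to sketch: score each mention of a venue, then roll the scores up. Here’s a toy illustration with made-up data and a deliberately crude lexicon – real systems use far richer models.

```python
# Toy "buzz" aggregation for local venues. Made-up data and crude scoring;
# not Loku's Big Data for Local technology.
from collections import defaultdict

# Pretend these mentions were gathered from reviews, tweets, and blog posts.
MENTIONS = [
    {"venue": "Taco Spot", "text": "amazing tacos, great patio"},
    {"venue": "Taco Spot", "text": "terrible wait, rude staff"},
    {"venue": "Coffee Bar", "text": "great espresso"},
]

POSITIVE = {"amazing", "great", "love"}
NEGATIVE = {"terrible", "rude", "avoid"}

def score(text):
    """Crude lexicon sentiment: +1 per positive word, -1 per negative word."""
    words = set(text.lower().split())
    return len(words & POSITIVE) - len(words & NEGATIVE)

buzz = defaultdict(list)
for mention in MENTIONS:
    buzz[mention["venue"]].append(score(mention["text"]))

for venue, scores in buzz.items():
    print(venue, "average buzz:", sum(scores) / len(scores))
```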

Currently available for localities in 15 major markets of the U.S., Loku.com presents the latest local happenings in an intriguing grid format that evokes a work of art. Headlines, social updates, and photos are intermixed on the colorful grid, inviting exploration. Through the top navigation bar, users can shift to a street map that contains the local hangouts and places of interest. Click on a dot and a pop-up appears that reveals the colorful secrets only a local friend could share.

Loku.com brings local news, tips, reviews, opinions, openings, daily deals, and much more, all into one place. “The experience is not unlike talking with a local who is ‘the source’—the one person everyone goes to for good information. That wasn’t accidental,” stated Street.

To sustain Loku.com, Street and his executive team are forging partnerships with service providers in food/drink, entertainment, transportation, financial services, retail goods and more. Rather than resorting to ads, the site will use its partnerships to facilitate local purchases. Group-buying sites were the first to sign up, with others soon to be added. All partners will be opinion-neutral relative to local businesses, to preserve the objectivity of the site’s information.


Innotech Beta Summit 2011

If you’re attending the Innotech Beta Summit on Thursday, here’s a rundown on the companies that you will see present. Is it a coincidence that half of the companies presenting are “Big Data” plays? You decide.

If you want to register for the conference (so you can attend the Beta Summit), you can register here and use the special discount code for AustinStartup readers that will get you 25% off. That code is SML25.

I’ve got a special deal for you, dear blog readers. For the first 20 people, you can register for Innotech and get a ticket to Austin Tech Happy Hour (5:30pm at Molotov) both totally free. Use discount code BETA888 to claim it.

SubjectLines maximizes email marketing effectiveness by helping you create subject lines that get your emails opened. Discover what your competitors are sending out, and how effective they are. The analytics are driven by the behavior of more than 500,000 anonymous consumer mailboxes, updated daily. Come see how awesome your emails can be.

Social media is hot. Among all that social data is a ton of insight just waiting to be harvested. Polygraph Media is a social media data mining company that gathers intelligence from billions of daily social interactions on Facebook, Twitter, and YouTube.  Marketers receive actionable reports on social media activity for their own pages and their competitors.  Enterprises use our platform for ongoing monitoring, comment moderation, and custom projects.

At Loku, we make it easy to tap into the local scene.  We use our proprietary Big Data for Local software to uncover the real character of a place, be it a nearby coffee shop or a city far away. On Loku.com you can find the latest news and stories, discover new things to do, and get the inside tips on food and drink around town.

Everyone loves to take pictures of their favorite bands when they’re at a concert, but wouldn’t it be great to see a bunch of fan photos of your favorite band? Vivogig is a mobile app and website that lets fans capture and tag live music photos and compete to earn the top spot on the photo charts while supporting their favorite bands.

Forecast is a fun and simple way to share where you’re going. It’s like Foursquare for the future. Instead of telling your friends where you are now with a check in, you create a forecast to tell your friends where you’ll be later. This simple change from present tense to future tense helps you connect with your real friends out in the real world. Forecast is social networking that’s actually social.

HelpJuice is THE auto-updating help page for businesses. We make sure your customers find the answer to their questions – and from your end, we watch the emails your support team sends out, and keep your Help/FAQ page up to date.



Dachis Group Launches Social Business Index

I hang out with a lot of software and SaaS types. One of the most frequent questions I get about Dachis Group concerns our consulting services, and whether we will ever make money in ways other than hourly services. We’ve been quietly building some incredible “big data” capabilities, and I’m happy to say that today we’re finally launching a glimpse into that data set. It’s called the Social Business Index.

Over the past year, we’ve been utilizing all sorts of real-time, big data software pieces with crazy names like Hadoop and Cassandra. Then we tracked down about 20,000 of the top companies and correlated over 26,000 brands to their parent companies. Next we had to figure out crazy things like which Twitter accounts are owned by people who work at those companies, which blogs write about those companies, and so on. By launch time we had somewhere in the neighborhood of 300 million data sources.
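Correlating tens of thousands of brands to their parent companies is, at heart, an entity resolution problem. As a toy illustration of one textbook approach – fuzzy string matching, which is not necessarily what Dachis Group does, and which real pipelines would supplement with ownership databases and alias lists:

```python
# Toy entity resolution: match brand names to parent companies by string
# similarity. Made-up data; one textbook approach, not Dachis Group's method.
from difflib import SequenceMatcher

COMPANIES = ["Procter & Gamble", "Unilever", "PepsiCo"]
BRANDS = ["Pepsi Max", "Unilever Food Solutions", "Procter & Gamble Beauty"]

def best_match(brand, companies):
    """Return the company name most similar to the brand name."""
    return max(companies,
               key=lambda c: SequenceMatcher(None, brand.lower(),
                                             c.lower()).ratio())

for brand in BRANDS:
    print(brand, "->", best_match(brand, COMPANIES))
```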

What does all this data tell you?

That’s the tricky part. Data is just data. We’ve decided to create some measures and metrics from the data, and to have that manifested visually as the Social Business Index. But that’s really just the tip, and the whole iceberg is waiting for you. We anticipate that other organizations will utilize the data in ways that we could have never imagined.

There are many popular and profitable software companies that help you listen to what consumers are saying. The index does almost the opposite, by listening to what companies, their employees, and their suppliers are saying and doing.

Coverage of the launch has been fantastic, with stories today from TechCrunch, The Next Web, and AdWeek. You can also read several posts about the index on the Dachis Group blog.


Infochimps Releases Unified Geo Data In One API

I work with a lot of consumer startups that are dealing with location-aware information. From a programmer perspective, it’s just plain hard. Not only is it difficult to find a great source for quality geo data, but then you typically want to correlate it across multiple social systems (Foursquare, Yelp, etc). Some of my favorite code monkeys have just made this problem a little easier.

Infochimps is pioneering a somewhat new category: data-as-a-service. Today they’re announcing the availability of the Infochimps Geo API (a geospatial application programming interface). The Geo API enables developers to incorporate geographic data sources and features into their software applications by adding a layer of diverse and rich location information. The Geo API provides data from open sources such as Geonames, the National Climatic Data Center and the American Community Survey, as well as licensed sources such as Foursquare and Locationary.

To show how easy the Geo API is to use, they had a non-programmer create a little sample application. The Travel Guide app helps users find notable travel spots for destinations around the world, placing Wikipedia articles and Foursquare venues on a map to show interesting museums, parks, and nightlife locations in any city in the world.

“This is a transformational development for the geo data market,” said Flip Kromer, Co-founder and CTO of Infochimps. “Up until now, developers have faced issues and barriers when working with geo data – everything from the difficulty in finding accurate, up-to-date data, to the lack of query-based standards. Infochimps’ goal in releasing our Geo API is to democratize the market’s access to a variety of geo data in an easy to digest form.”

The Geo API delivers an extensive set of features for building social, geo, and mapping applications, including:

  • Disparate Data Sources Unified by the Infochimps Simple Schema (ICSS) – Regardless of its original source, data is organized into a unified schema that makes integrating data from multiple sources quick and easy.
  • More Ways To Ask The Questions You Want with Multiple Locator Options – Standard geographic locators like street address, bounding box, quadkey, and latitude/longitude can be used to query any data source in the Geo API. Furthermore, any dimension that can be mapped back to a location, such as a Wikipedia Page ID or Foursquare Venue ID, can also be used to query.
  • Summarizer Tool Allows for Easy Roll-ups of Data – The Summarizer, a unique feature of the Geo API, manages the flood of data when queries return a large number of matching results. The Summarizer makes query results more usable by organizing data points into intelligent geographic clusters.

Here’s another neat example: animated weather station data from 1892 to 2011. We’ve grown a lot of weather stations in the past hundred years.

The Geo API release continues Infochimps’ commitment to making data more accessible, while helping to push the market for application development forward through easy access to a rich variety of data sources and APIs, giving developers the ability to focus on building awesome apps.

To access the Geo API, developers can quickly register for an Infochimps API key and immediately begin building on the data for free, making up to 100,000 free API calls per month.
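As a rough idea of what getting started might look like, here’s a minimal Python sketch. Note that the endpoint path and parameter names below are illustrative assumptions on my part, not the documented interface – check the actual Infochimps API docs before building.

```python
# Hypothetical sketch of a free-tier Geo API call. The endpoint path and
# parameter names are illustrative assumptions, not the documented interface.
import requests

API_KEY = "YOUR_INFOCHIMPS_KEY"     # free key, up to 100,000 calls per month
BASE = "http://api.infochimps.com"  # assumed base host

def places_near(lat, lng, radius_m=1000):
    """Query point-of-interest data near a latitude/longitude."""
    resp = requests.get(
        f"{BASE}/geo/location/places",  # hypothetical path
        params={"g.latitude": lat, "g.longitude": lng,
                "g.radius": radius_m, "apikey": API_KEY},
    )
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    # Downtown Austin, for example:
    for place in places_near(30.2672, -97.7431).get("results", []):
        print(place)
```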

The Infochimps Geo API is available today with the following data sources:

  • Foursquare
  • Locationary Points of Interest
  • Wikipedia Pages
  • Zillow Neighborhood Boundaries
  • Bundle.com
  • Digital Element
  • US Census
  • American Community Survey 2009
  • National Climatic Data Center
  • Geonames
  • Zip Codes
  • Political Boundaries
  • UFO Sightings

