Big Data companies to watch


These big data companies are ones to watch

Which companies are breaking new ground with big data technology? We ask 10 industry experts.

It’s hard enough staying on top of the latest developments in the technology industry. That’s doubly true in the fast-growing area known as big data, with new companies, products and services popping up practically every day.

There are scores of promising big data companies, but Fortune sought to cut through the noise and reached out to a number of luminaries in the field to ask which big data companies they believe have the biggest potential. Which players are really the ones to watch?

That question, we learned, is rather difficult to answer.

“A list of ‘big data companies’ is interesting because of the definition,” said Dean Abbott, co-founder and chief data scientist of Smarter Remarketer. “Is a ‘big data’ company one that is pure-play analytics with big data? A company that does predictive analytics often with big data but not always—like Beyond the Arc or Elder Research? A large company that has a group that usually does ‘big data analytics’ but it’s only a small part of what they do as a company—like HP, SAP, Dell, even GE?”

‘One of the most interesting ones I’ve seen’

There was certainly consensus on some of the big data companies that industry experts said were notable. At least two of 10 experts we polled named MapR, MemSQL, Databricks, Platfora, Splunk, Teradata, Palantir, Premise, Datameer, Cloudera, Hortonworks, MongoDB, and Trifacta as shining examples in the space.

MemSQL, for example, is “an in-memory relational database that would be effective for mixed workloads and for analytics,” said Svetlana Sicular, an analyst at Gartner. “While SAP draws so much attention to in-memory databases by marketing their Hana database, MemSQL seems to be a less expensive and more agile solution in this space.”
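
For context, here is a minimal sketch of how an analyst might query such a system from Python. It assumes the MySQL-compatible wire protocol that MemSQL exposes, plus a hypothetical host, schema and credentials chosen purely for illustration:

    import pymysql  # a standard MySQL client; MemSQL speaks the MySQL wire protocol

    # Hypothetical connection details and table names, for illustration only.
    conn = pymysql.connect(host="memsql.example.com", port=3306,
                           user="analyst", password="secret", database="sales")
    try:
        with conn.cursor() as cur:
            # A mixed-workload style query: aggregate recent orders while writes continue.
            cur.execute("""
                SELECT region, COUNT(*) AS orders, SUM(total) AS revenue
                FROM orders
                WHERE order_date >= CURDATE() - INTERVAL 7 DAY
                GROUP BY region
                ORDER BY revenue DESC
            """)
            for region, orders, revenue in cur.fetchall():
                print(region, orders, revenue)
    finally:
        conn.close()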

Splunk, meanwhile, “has an excellent technology, and it was among the first big data companies to go public,” Sicular said. “Now, Splunk also has a strong product called Hunk—Splunk on Hadoop—directly delivering big data solutions that are more mature than most products in the market. Hunk is easy to use compared to many big data products, and generally, most customers I spoke with expressed their love to Splunk without any soliciting on my side.”

Palantir Technologies, which focuses on data analysis for public sector clients, also received high marks. “I’d have to put Palantir at the top of the list” of the startups in the big data space, said Tom Davenport, an IT management professor at Babson College and author of the book, Big Data @ Work.

DJ Patil, vice president of product at RelateIQ, said San Francisco-based Trifacta—which makes a “data transformation platform” that promises to increase productivity for data analysts—was a company to watch. “One of the most interesting ones I’ve seen,” said Patil, who serves as a technical advisor to the company.

Two miles away, cross-town peer Datameer is also remarkable, said Carla Gentry, founder of Analytical-Solution. “There are lots more companies out there, but most of them just end up being a BI platform that anyone with an engineer could have started.” Datameer is different, she said.

‘Graphs have a great future’

Some of the less well-known companies received the highest praise.

Tamr, for instance, is “an exciting startup in data curation, so that would be my nomination,” said Gregory Piatetsky-Shapiro, president and editor of KDnuggets.com.

Neo Technology, the company behind open source graph database Neo4j, is another that Gartner’s Sicular pointed out. “I think graphs have a great future since they show data in its connections rather than a traditional atomic view,” she said. “Graph technologies are mostly unexplored by the enterprises but they are the solution that can deliver truly new insights from data.” (She also named Pivotal, The Hive and Concurrent.)
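
To make the “data in its connections” point concrete, here is a minimal sketch using the official Neo4j Python driver against a hypothetical customer/product graph; the endpoint, credentials and labels are assumptions, not part of any real deployment:

    from neo4j import GraphDatabase  # official Neo4j Python driver

    driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

    # "Customers who bought what Alice bought" -- a relationship-centric question
    # that is awkward in row-oriented SQL but natural in Cypher.
    query = """
    MATCH (c:Customer {name: $name})-[:BOUGHT]->(p:Product)<-[:BOUGHT]-(other:Customer)
    RETURN other.name AS name, count(p) AS shared_purchases
    ORDER BY shared_purchases DESC LIMIT 5
    """

    with driver.session() as session:
        for record in session.run(query, name="Alice"):
            print(record["name"], record["shared_purchases"])
    driver.close()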

DataStax, WibiData, Aerospike, Ayasdi and ClearStory were all part of analyst Curt Monash’s “obvious inclusion” list, he said, while Automatic, Planet Labs, Sight Machine, DataPad, Interana, Wise.io, LendUp, Declara, Sentinel Labs, Fliptop, Sift Science, Import.io and Segment.io were among those named by data scientist Pete Skomoroch.

Paxata and Informatica were both cited by Ovum analyst Tony Baer; IBM, Syntasa, Actian and Tableau were four named by George Mason University professor and data scientist Kirk Borne.

“There are a number of startups in security and machine learning that are emerging,” Baer said. “What’s missing right now are startups that look at data governance, stewardship, lifecycle management for big data. Right now IBM is largely alone, but I’m expecting there will be more startup action to come.”

‘Most of these companies will go away’

If you’ve reached this point in the article, you will have read 42 recommendations by our panel of experts. All of them are, first and foremost, technology companies; most exist specifically to advance big data technology.

But some experts said that the most interesting big data companies aren’t big data companies at all. Established companies with traditional products and services are starting to develop offerings based on big data, Davenport said. Those include agriculture giant Monsanto, back-office operations stalwart Intuit, and the trucking company Schneider.

“To me, it’s much easier to create a startup than it is to change your entire way of thinking about information and how it relates to your business operation,” Davenport said. “One of the really exciting things about big data is when you use it to create new products and services.”

He added with hesitation: “It’s early days still, and we don’t know how easy it will be for companies to make money off these things.”

It is inevitable that there will eventually be a thinning of the big data herd, experts said.

“Most of these companies will go away because the most important part of the big data movement will be how to use data operationally—to make decisions for the business,” Smarter Remarketer’s Abbott said, “rather than who can merely crunch more data faster.”



Accenture Revenue Analytics and Business Intelligence


Revenue Analytics and Business Intelligence

Overview


Revenue agencies face an environment of electronic filing, reduced resources, new taxpayer expectations and complex fraud schemes.

  • How can revenue agencies use data to improve risk management practices?
  • How can revenue agencies use data to better understand taxpayer behaviors?
  • How can revenue agencies improve debt collection, returns processing and audit activities?

More and more, leading revenue agencies are capturing significant amounts of data from taxpayers and third party providers. Many are asking questions like these and want to better analyze this data to understand taxpayer behaviors, improve fraud detection and risk management and enhance compliance enforcement.

Some are looking to business intelligence and predictive analytics to create new value from data. They are using quantitative methods to derive actionable insights and outcomes from data to meet these objectives and improve agency functions; a minimal sketch of what such a predictive model might look like follows the list below. Already standard practice in the private sector, analytics can help revenue agencies achieve high performance by:

  • Generating more revenue.
  • Enhancing compliance via targeted compliance enforcement activities.
  • Improving audit, collections and returns processing procedures.
  • Optimizing use of personnel and resources.
  • Improving and streamlining taxpayer service.
  • Customizing service delivery channels based on taxpayer preference.
  • Improving operational visibility.
  • Understanding voluntary compliance.
  • Enabling faster and better decision making.
  • Optimizing the return on existing business and technology investments.
  • Reducing tax fraud.
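
As referenced above, here is a minimal sketch of the predictive-analytics piece, assuming scikit-learn and a hypothetical file of historical returns with known audit outcomes; the column names are illustrative, not any agency's actual schema:

    import pandas as pd
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    # Hypothetical historical data: one row per filed return, with audit outcomes.
    returns = pd.read_csv("historical_returns.csv")
    features = returns[["reported_income", "deduction_ratio",
                        "late_filings", "third_party_mismatches"]]
    labels = returns["audit_found_issue"]  # 1 if a past audit uncovered a problem

    X_train, X_test, y_train, y_test = train_test_split(
        features, labels, test_size=0.25, random_state=0)
    model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
    print("Holdout accuracy:", model.score(X_test, y_test))

    # Score incoming returns so auditors can work the riskiest cases first.
    returns["risk_score"] = model.predict_proba(features)[:, 1]
    print(returns.sort_values("risk_score", ascending=False).head(10))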


Big Data Today


Incredible Ways Big Data Is Used Today

 

The term ‘Big Data’ is as big in management and technology as Justin Bieber and Miley Cyrus are in music. As with other mega buzzwords, many claim big data is all talk and no action. This couldn’t be further from the truth. In this post, I want to show how big data is used today to add real value.

Eventually, every aspect of our lives will be affected by big data. For now, though, there are some areas where big data is already making a real difference. I have grouped the applications of big data into the 10 areas where I see the most extensive use and the greatest benefits. [For those who want to take a step back and understand, in simple terms, what big data is, check out the posts in my Big Data Guru column.]

1. Understanding and Targeting Consumers
This is one of the biggest and most publicized areas of big data use today. Here, big data is used to better understand customers and their behaviors and preferences. Companies are keen to expand their traditional data sets with social media data, browser logs, text analytics and sensor data to get a more complete picture of their customers. The big objective, in many cases, is to create predictive models. You may remember the example of U.S. retailer Target, which can now very accurately predict when one of its customers is expecting a baby. Using big data, telecommunications companies can better predict customer churn; Walmart can forecast which products will sell; and car insurance companies can learn how well their customers actually drive. Even government election campaigns can be optimized using big data analytics; some believe Obama’s win in the 2012 presidential election campaign was due to his team’s superior ability to use big data analytics.
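
As an illustration of broadening traditional data sets, here is a minimal pandas sketch that joins a hypothetical CRM extract with clickstream and social-sentiment data into one customer profile table; all file and column names are assumptions for the example:

    import pandas as pd

    # Hypothetical extracts: a CRM table, web clickstream logs, and social mentions.
    crm = pd.read_csv("crm_customers.csv")        # customer_id, tenure_months, plan
    clicks = pd.read_csv("web_clicks.csv")        # customer_id, page, timestamp
    social = pd.read_csv("social_mentions.csv")   # customer_id, sentiment

    click_counts = clicks.groupby("customer_id").size().rename("click_count")
    avg_sentiment = social.groupby("customer_id")["sentiment"].mean().rename("avg_sentiment")

    # One row per customer, combining traditional and behavioural signals,
    # ready to feed a churn or propensity model.
    profile = (crm.set_index("customer_id")
                  .join([click_counts, avg_sentiment])
                  .fillna({"click_count": 0, "avg_sentiment": 0.0}))
    print(profile.head())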

2. Understanding and Optimizing Business Processes
Big data is also increasingly used to optimize business processes. Retailers are able to optimize their stock based on predictions generated from social media data, web search trends and weather forecasts. One particular business process that is seeing a lot of big data analytics is supply chain or delivery route optimization. Here, geographic positioning and radio frequency identification (RFID) sensors are used to track goods or delivery vehicles and optimize routes by integrating real-time traffic data, among other inputs. Human resources business processes are also being improved using big data analytics. This includes the optimization of talent acquisition, Moneyball style, as well as the measurement of company culture and staff engagement using big data tools.
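
A toy sketch of the route-optimization idea, using the networkx library and hypothetical travel times that a live traffic feed would keep updating; real systems solve far larger routing problems, but the principle is the same:

    import networkx as nx

    # Hypothetical road graph: edge weights are travel minutes.
    roads = nx.DiGraph()
    roads.add_weighted_edges_from([
        ("depot", "A", 12), ("depot", "B", 9),
        ("A", "customer", 7), ("B", "customer", 15),
        ("B", "A", 3),
    ])

    # Re-plan whenever the traffic feed bumps an edge weight.
    roads["depot"]["B"]["weight"] = 25   # congestion reported on depot -> B

    route = nx.shortest_path(roads, "depot", "customer", weight="weight")
    minutes = nx.shortest_path_length(roads, "depot", "customer", weight="weight")
    print(route, minutes)
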
3. Personal Quantification and Performance Optimization

Big data is not just for companies and governments but also for all of us individually. We can now benefit from the data generated by wearable devices such as smart watches or smart bracelets. Take the Up band from Jawbone as an example: the armband collects data on our calorie consumption, activity levels and sleep patterns. While it gives individuals rich insights, the real value is in analyzing the collective data. In Jawbone’s case, the company now collects 60 years’ worth of sleep data every night. Analyzing such volumes of data will bring entirely new insights that can be fed back to individual users. The other area where we benefit from big data analytics is finding love – online, that is. Most online dating sites apply big data tools and algorithms to find us the most appropriate matches.
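
Here is a small pandas sketch of what analyzing this kind of data can look like at the individual level, assuming a hypothetical per-minute export from a fitness band with step, heart-rate and sleep columns:

    import pandas as pd

    # Hypothetical export from a fitness band: one row per minute.
    readings = pd.read_csv("band_export.csv", parse_dates=["timestamp"]).set_index("timestamp")
    # columns: steps, heart_rate, asleep (0/1)

    daily = readings.resample("D").agg({
        "steps": "sum",
        "heart_rate": "mean",
        "asleep": "sum",          # minutes asleep per day
    })
    daily["hours_asleep"] = daily["asleep"] / 60.0
    print(daily.tail(7))          # the last week at a glance
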
4. Improving Healthcare and Public Health

The computing power of big data analytics enables us to decode entire DNA strands in minutes and will allow us to find new cures and better understand and predict disease patterns. Just think of what happens when all the individual data from smart watches and wearable devices can be used and applied to millions of people and their various diseases. The clinical trials of the future won’t be limited by small sample sizes but could potentially include everyone. Big data techniques are already being used to monitor babies in a specialist premature and sick baby unit. By recording and analyzing every heartbeat and breathing pattern of every baby, the unit was able to develop algorithms that can now predict infections 24 hours before any physical symptoms appear. That way, the team can intervene early and save fragile babies in an environment where every hour counts. What’s more, big data analytics allows us to monitor and predict the development of epidemics and disease outbreaks. Integrating data from medical records with social media analytics enables us to monitor flu outbreaks in real time, simply by listening to what people are saying, i.e. “Feeling rubbish today – in bed with a cold”.
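
A crude sketch of the real-time flu-monitoring idea, assuming a hypothetical dump of public posts with timestamps; production systems use far more robust language models, but the counting logic is similar:

    import pandas as pd

    # Hypothetical dump of public posts with a timestamp and text column.
    posts = pd.read_csv("posts.csv", parse_dates=["created_at"])

    flu_terms = ("flu", "fever", "in bed with a cold", "chills")
    posts["mentions_flu"] = posts["text"].str.lower().str.contains("|".join(flu_terms))

    daily_signal = (posts.set_index("created_at")["mentions_flu"]
                         .resample("D").sum())

    # A crude outbreak flag: today's count is well above the trailing two-week average.
    baseline = daily_signal.rolling(14).mean()
    alerts = daily_signal[daily_signal > 2 * baseline]
    print(alerts.tail())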

5. Improving Sports Performance
Most elite sports have now embraced big data analytics. We have the IBM SlamTracker tool for tennis tournaments; we use video analytics that track the performance of every player in a football or baseball game; and sensor technology in sports equipment such as basketballs or golf clubs allows us to get feedback (via smartphones and cloud servers) on our game and how to improve it. Many elite sports teams also track athletes outside of the sporting environment, using smart technology to track nutrition and sleep, as well as social media conversations to monitor emotional wellbeing.

6. Improving Science and Research
Science and research is currently being transformed by the new possibilities big data brings. Take, for example, CERN, the Swiss nuclear physics lab with its Large Hadron Collider, the world’s largest and most powerful particle accelerator. Experiments to unlock the secrets of our universe – how it started and how it works – generate huge amounts of data. The CERN data center has 65,000 processors to analyze its 30 petabytes of data. However, it also uses the computing power of thousands of computers distributed across 150 data centers worldwide to analyze the data. Such computing power can be leveraged to transform many other areas of science and research.

7. Optimizing Machine and Device Performance
Big data analytics helps machines and devices become smarter and more autonomous. For example, big data tools are used to operate Google’s self-driving car. The Toyota Prius is fitted with cameras, GPS, powerful computers and sensors to drive safely on the road without the intervention of human beings. Big data tools are also used to optimize energy grids using data from smart meters. We can even use big data tools to optimize the performance of computers and data warehouses.

8. Improving Security and Law Enforcement
Big data is applied heavily in improving security and enabling law enforcement. I am sure you are aware of the revelations that the National Security Agency (NSA) in the U.S. uses big data analytics to foil terrorist plots (and maybe spy on us). Others use big data techniques to detect and prevent cyber attacks. Police forces use big data tools to catch criminals and even predict criminal activity, and credit card companies use big data to detect fraudulent transactions.
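
One common approach to the card-fraud piece is unsupervised outlier detection. Here is a minimal sketch with scikit-learn's IsolationForest over hypothetical transaction features; the file and column names are illustrative only:

    import pandas as pd
    from sklearn.ensemble import IsolationForest

    # Hypothetical card transactions: amount, merchant category, distance from home, hour of day.
    txns = pd.read_csv("transactions.csv")
    features = txns[["amount", "merchant_category_code", "km_from_home", "hour"]]

    # Unsupervised outlier detection: flag roughly the 1% most unusual transactions for review.
    detector = IsolationForest(contamination=0.01, random_state=0).fit(features)
    txns["suspicious"] = detector.predict(features) == -1   # -1 marks outliers
    print(txns[txns["suspicious"]].head())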

9. Improving and Optimizing Cities and Countries
Big data is used to improve many aspects of our cities and countries. For example, it allows cities to optimize traffic flows based on real-time traffic information as well as social media and weather data. A number of cities are currently piloting big data analytics with the aim of turning themselves into Smart Cities, where the transport infrastructure and utility processes are all joined up: a bus waits for a delayed train, and traffic signals predict traffic volumes and operate to minimize jams.

10. Financial Trading
My final category of big data application comes from financial trading. High-Frequency Trading (HFT) is an area where big data finds a lot of use today. Here, big data algorithms are used to make trading decisions. Today, the majority of equity trading takes place via data algorithms that increasingly take into account signals from social media networks and news websites to make buy and sell decisions in split seconds.
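
A deliberately simplified sketch of a signal-driven trading rule, nothing close to a production HFT system, just to show how a sentiment signal and a short-term price move might be combined into a decision; thresholds and inputs are assumptions:

    # Toy decision rule: combine a news/social sentiment score with a recent price move.
    def decide(sentiment_score: float, price_change_pct: float) -> str:
        """sentiment_score in [-1, 1] from a hypothetical news/social feed;
        price_change_pct is the move over the last few seconds."""
        if sentiment_score > 0.5 and price_change_pct < 0.1:
            return "BUY"      # strong positive news the price has not yet absorbed
        if sentiment_score < -0.5 and price_change_pct > -0.1:
            return "SELL"     # strong negative news, price still holding up
        return "HOLD"

    print(decide(0.8, 0.02))  # -> BUY
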
For me, the 10 categories I have outlined here represent the areas where big data is used the most. Of course there are many other applications of big data, and there will be many new categories as the tools become more widespread.


 



Big Push into Big Data


Midsize Companies Plan Big Push into Big Data

Ann All | ARTICLES | POSTED 28 APR, 2014


Big Data is not just within the purview of big companies. That’s the takeaway of new research from Competitive Edge Research Reports. The firm’s study, which was commissioned by Dell, found that 41 percent of midsize companies are already involved in one or more Big Data projects while 55 percent are planning initiatives.

Midsize companies that have moved from planning into implementing Big Data projects are reporting successes, the research found.

About half of respondents with existing initiatives gave themselves high marks in six strategic tasks: improving speed and quality of decision-making, improving product quality, identifying and taking advantage of business opportunities, understanding constituent sentiment, understanding customer needs and predicting future trends that may impair achievement of business goals. In contrast, fewer than 30 percent of respondents that were still just planning Big Data projects scored themselves well in most of these categories.

The gap was greatest in the improved decision-making category, in which 50 percent of companies with Big Data in production said they performed “very well” in this area, vs. 23 percent of companies that are still in the planning stage.

Big Data Spending

Buoyed by these early wins, respondents expect to boost their Big Data budgets by at least $2 million over the next two years. The research firm predicts the average Big Data budget will reach $6 million during this time frame.

“The early success midmarket companies are seeing with their Big Data initiatives will encourage more growth and investment, and additional returns on that investment will be achieved as they dive further into different datasets and embrace ever-improving analytic capabilities,” said Darin Bartik, executive director, product management, information management, Dell Software, in a statement.

The study offers insight into the tools that will likely be included in Big Data budgets. The three tools ranked as most valuable were real-time processing of data and analytics, predictive analytics, and data visualization to convert processed data into actionable insights.

Big Data Challenges, Best Practices

While the study found many positive trends, some daunting Big Data challenges still remain. Named most frequently as challenges: wide variety of new data types and structures, cited by 40 percent of respondents; sheer volume of data slows processing, mentioned by 34 percent; and budget limitations to improve data analysis capabilities, 32 percent.

Perhaps the most valuable insights in the study are the factors respondents named as key to their Big Data success. Forty-one percent of them tapped strong collaboration between business units and IT organizations as a success factor. Similarly, IDC Retail Insights recently noted that companies with more advanced Big Data capabilities were far more likely than their less mature peers to involve both business and IT in their Big Data projects.


IDC Retail Insights recommends that companies establish a collaborative governance structure involving lines of business, IT and a separate analytics group. With Big Data initiatives, high achievers tend to put IT in a leadership role, a finding that runs counter to the common practice of having lines of business lead technology-driven business initiatives, IDC Retail Insights found. Under such models, IT is responsible for overall strategy, planning and application development, with business responsible for evaluating the capabilities created by IT and the ultimate business outcomes. The analytics group handles management of data, content and analytics.

The Dell-commissioned research recognizes “encouraging signs of shared responsibilities taking shape among the management ranks.” For example, while 76 percent of respondents said IT was the most responsible for implementing Big Data projects, 56 percent mentioned sales management in a leadership role. This emphasis on customer-facing issues reinforces a need for close collaboration, the report notes.

Ann All is the editor of Enterprise Apps Today and eSecurity Planet. She has covered business and technology for more than a decade, writing about everything from business intelligence to virtualization.

 



IBM’s Multi-Billion-Dollar Cloud/Big Data


Looking Far Ahead: IBM’s Multi-Billion-Dollar Cloud/Big Data Analytics Strategic Initiative

Rob Enderle | UNFILTERED OPINION | POSTED 11 JUL, 2014



Yesterday, IBM announced its massive $3 billion investment effort to create a processor uniquely suited to the demands of the cloud and Big Data systems. This has both strategic and timing implications for the market. Efforts like this speak to the level of maturity in the market as well as IBM’s commitment to leading it. The announcement anticipates a massive improvement in processor scaling down to 7 nanometers and begins to flesh out what the Big Data world will look like in 2020.

 

Why Do You Care About 2020?

Often the mistake that both technology companies and IT managers make is focusing too much on the tactical and working to solve the problems of today. This tactical focus generally puts everyone in firefighting or whack-a-mole mode, constantly on the verge, and sometimes over it, of being overwhelmed by the changes they can barely keep up with.

The reason to maintain at least a five-year view is so that strategic efforts like Big Data analytics, which consume massive resources, aren’t prematurely obsolete and you can anticipate the world that will exist once they have matured. This eye on the future is often what distinguishes firms that survive decades from those that don’t make it to their first 10-year anniversary. The surviving firms have anticipated and prepared for changes.

What the IBM Announcement Means

This announcement means the market has reached the point where large solutions providers are beginning to build solutions from the ground up, not cobble together technologies that were designed for other things into a kludge that sort of works. The data analytics and cloud solutions increasingly demand a level of performance and granularity that wasn’t imagined when current chip technologies were incubated, and big vendors like IBM now understand what current technologies don’t do. Thus, they have a roadmap to create something that works better.

 


They are now looking beyond 7-nanometer technology into quantum computing, carbon nanotubes and neurosynaptic computing. The first two are being worked on by a number of vendors now and promise, but have yet to deliver, massive improvements in processing speed and levels of security well beyond what we have today. Neurosynaptic computing promises a near-human-like capability to anticipate and proactively respond to future events by systems, taking products like IBM’s Watson to a future where they largely train themselves and can actually figure out more quickly the answers to questions that users haven’t even thought to ask. Carbon nanotubes (CNTs), in particular, could help define the smartphones and personal devices of the future.

For those next-generation devices, which will be accessing these increasingly intelligent Watson-like back-end systems (think of Siri or more likely Microsoft’s Cortana on steroids), you’ll need advanced, low-powered transistors that are far more efficient. Many of those devices will be wearable, and to keep heat and power costs down to manageable levels, the data centers of tomorrow will find CNTs and Tunnel Field Effect Transistors (TFETs) critical. Finally, graphene will be critical to the future, as silicon runs out of performance headroom and a more efficient and powerful alternative is increasingly required.

They are also layering on advancements in silicon photonics, which massively improve the speed of data transport. That matters because Big Data cloud-hosted jobs tend to be very fluid with regard to where they are located and have to be moved depending on their relative importance to the company and the proximity of the user.

Wrapping Up: How IBM Lasts

IBM’s announcement both showcases that the cloud and Big Data analytics requirements of today can now be projected into the future and that it is turning its massive R&D engine toward creating the technologies that future will require. This is a massive effort by IBM, showcasing the kind of commitment that made it the longest-lasting technology vendor in the market. The firm is a survivor largely because it is able to step outside of the day-to-day tactical concerns and invest in the world of tomorrow so it has a place in it. That’s a decent example that more firms in every segment should follow.

 



THE PERFECT STORM OF BIG DATA, CLOUD AND INTERNET-OF-THINGS
