Moviri blog

Moviri at the Splunk .conf2011

San Francisco, Aug 15 through 17, 2011

Splunk user conference 2011 badge

“The 2nd annual Splunk worldwide users’ conference”: the title itself explains better than anything else what Splunk cares about and what its primary market is. This summer we decided to pay a visit to .conf2011 and see whether other users and customers really share the enthusiasm we experience in Italy.

The conference spanned three days: the first consisted of optional training classes called Splunk University, while the remaining two packed more than 50 talks into 5 tracks across 10 sessions.

The registration process was as smooth as it gets, and we received some partner brochures, a beer sleeve and a useful Splunk Quick Reference Guide that will make our next new customer very happy…

We arrived at the conference main stage after the “university”, just in time to socialize and grab a beer (that’s what the sleeve was for!) at the “Birds of a feather” session. This informal, bring-your-own-ideas session was built around a few major topics, voted on by the attendee community, that participants could freely discuss. We got to know some of the 600+ attendees and noticed that Splunk users are not only of the technical type.

Next on the agenda was a visit to the Splunk Solutions Lab, where we could get our hands on the newly released applications and interview their creators. The typical motto was: “you just can’t do all of this with anything else”. Splunk already had a comprehensive ecosystem of extension applications, and additions like the Splunk App for VMware, Splunk App for Microsoft Exchange, Splunk App for Transaction Profiling and Splunk App for Web Intelligence will greatly help current and new customers monitor these complex environments.

Day Two

Splunk user conference 2011 keynote

The second day started with the keynote. CEO Godfrey Sullivan shared his vision with us: enable Splunk users to master the three major forces shaping IT today: the Cloud, Big Data, and the Consumerization of IT. Machine-generated data will be accessible, usable and valuable to everyone. Thanks to Splunk, of course. It was then the turn of Erik Swan, co-founder and CTO, who highlighted the novel ways Splunk has been employed this year and shared some insights on the forthcoming release. Paul Sanford, Director of Product Management, described the new developer platform being built at the brand new Seattle office. Christina Noren, SVP Solutions, detailed how Splunk will grow its application offering to provide more out-of-the-box value to end users. Rob Das, co-founder and chief architect, unveiled a new, cloud-based product offering: Splunk Storm, a pay-as-you-go version of Splunk suited to analysing data generated in the cloud.

After the keynote we headed to our first session of choice: Building a solutions business on Splunk. Splunk partners from Sideview, SPP and Aplura explained how they currently do business through Splunk and the major issues they face while developing custom applications or providing consulting services. We fully agree with what they reported: as Splunk enables new use cases, it is very difficult for a customer to move past the “that can’t be done” mindset and embrace Splunk not as a product, but as a platform on which to build custom solutions.

We then attended the official presentation of Splunk Storm. This cloud-based offering is essentially made of two components: Splunk as we know it, plus the infrastructure for automatic provisioning, scaling and backup. Users will be able to stream their applications’ log data to specific endpoints and analyze it with Splunk without a local deployment. We think a cloud-based solution is a must-have for most modern software providers, but the Italian market is not yet ready due to somewhat slow cloud adoption (perhaps because of strict Data Protection Authority measures?).

During the following session, Achieving enterprise level security visibility using Splunk, General Electric shared their experience using Splunk as a SIEM to analyze more than 100 billion security-related events per month. They have gone as far as integrating an automated asset discovery tool with Splunk. Their lessons learned read like best practices for almost anyone:

  • categorize events through event types;
  • document, document, document all the fields available for each type of data;
  • train all the internal customers.
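The first of these practices maps directly onto Splunk’s eventtypes.conf, where each stanza names an event type and the search that defines it. A minimal sketch (the stanza names and searches are hypothetical examples, not GE’s actual configuration):

```ini
# eventtypes.conf -- each stanza defines an event type as a saved search;
# matching events are tagged with an "eventtype" field at search time
[failed_login]
search = sourcetype=linux_secure "Failed password"

[web_server_error]
search = sourcetype=access_combined status>=500
```

Once defined, the categories can be used like any other field, e.g. `eventtype=failed_login | stats count by host`.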

Splunk user conference 2011 UI testing session

After the afternoon break, a large American university described how they had grown Splunk from a point solution to a platform. Their story was perfectly aligned with the company philosophy of starting small, then growing big. It is not always an easy path: Splunk can do so many things that sooner or later its owners will want it to do something someone else is already doing. It takes patience, but in the end, as this university told the audience, we will all be able to use Splunk for the most diverse use cases, to the great satisfaction of its end users.

Just before getting ready for dinner we got the chance to listen to Paul Sanford’s What’s new in Splunk APIs and SDKs. The room was completely full: there wasn’t even standing room left! The new Seattle-based office is all about creating the basis for a developer platform: third-party or custom products will be able to use Splunk’s repository and functionality. The current work involves releasing new Python, Java and JavaScript APIs, polishing and documenting the REST web services, and streamlining the custom application development process. Having experienced the lack of such tools ourselves, we think the Seattle team will greatly help both Splunk customers and partners expand the class of problems the product can address.
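To give a feel for the REST layer the SDKs wrap, here is a minimal sketch of how a search job is created: Splunk’s management port exposes a REST API, and a search job is started by POSTing a search string to /services/search/jobs. The host, port and query below are placeholder values, not a real deployment.

```python
from urllib.parse import urlencode, urljoin

# Placeholder splunkd management endpoint (8089 is the default management port).
SPLUNKD = "https://splunk.example.com:8089"

def search_job_request(query):
    """Build the URL and form body for creating a search job.

    Queries submitted through this endpoint must begin with the
    'search' command, so we prepend it when it is missing.
    """
    if not query.strip().startswith("search"):
        query = "search " + query.strip()
    url = urljoin(SPLUNKD, "/services/search/jobs")
    body = urlencode({"search": query})
    return url, body

url, body = search_job_request("error | stats count by host")

# Actually sending it requires a reachable splunkd and credentials, e.g.:
# import urllib.request
# req = urllib.request.Request(url, data=body.encode(), method="POST")
```

The SDKs announced in this talk essentially wrap this request/response cycle (authentication, job polling, result fetching) in idiomatic Python, Java and JavaScript.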

If that wasn’t enough for the day, the .conf2011 staff had organized a party at the California Academy of Sciences. After a walk through the rain forest, with the opportunity to admire fish, exotic snakes and butterflies, we took the Tour of the Universe at the digital planetarium, exploring the universe from the universe’s own point of view. So, in three words: Python and Big Data! The party went on with cool music (the day after, we discovered the DJ was Rachel, head of the Documentation team at Splunk), nice food and a flood of beer.

Day Three

To start the third and last day in the best way, we enrolled in the Optimizing searches session. We already knew some of the content from more than two years of consulting on the topic, but we now discovered why things work the way they do. Just before the end of the talk, we learnt how to write custom search commands using Splunk’s map-reduce model. What a nice takeaway!
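The map-reduce model behind those commands can be sketched in plain Python: each indexer runs the “map” step over its own slice of the events and ships back a partial result, and the search head “reduces” the partials into the final answer. This is a toy illustration of the model, not Splunk’s actual command API; the event data and field names are made up.

```python
from collections import Counter
from functools import reduce

def map_count_by_status(events):
    """'Map' phase, run independently on each indexer:
    count this indexer's events by HTTP status code."""
    return Counter(e["status"] for e in events)

def reduce_counts(partials):
    """'Reduce' phase, run once on the search head:
    merge the partial counts into the final result."""
    return reduce(lambda a, b: a + b, partials, Counter())

# Two hypothetical indexers, each holding part of the data.
indexer_a = [{"status": 200}, {"status": 500}, {"status": 200}]
indexer_b = [{"status": 404}, {"status": 200}]

total = reduce_counts([map_count_by_status(indexer_a),
                       map_count_by_status(indexer_b)])
# total == Counter({200: 3, 500: 1, 404: 1})
```

The design point the session drove home is that a command is cheap to distribute only when its map step needs nothing but local events, which is why streaming commands scale so much better than ones that must see the whole result set.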

Under the title Visibility inside your virtual environments, the Splunk App for VMware, one of the most awaited apps, was unveiled. This app acts as a collection point for all the information generated in the virtual infrastructure, able to poll data from both the Virtual Center and the ESX hosts. Now administrators will be able to answer questions like “On which host was my VM running one week ago?” and much more. The current release focuses on getting data in and provides basic correlation dashboards. Version 2.0 will include many more dashboards and out-of-the-box analyses. We can’t wait to see it in action.

To take a break from all the hot stuff coming out, we enrolled in the User Interface testing labs to provide our own feedback on the upcoming version of Splunk. We wanted to contribute somehow, and the 32nd floor of the Westin St Francis offered a magnificent setup.

Last but not least was the Roadmap feedback session, where each of the Core, Storm, Dev, Windows and Solutions teams detailed its own vision of future work. Long-term roadmaps are never a statement of committed actions, but the message seemed clear:

  • improve the user experience, by automating as much as possible and expanding the configuration capabilities of the administration interface;
  • improve resiliency, by hardening the High Availability and Disaster Recovery features;
  • build a base platform able to manage both unstructured and structured data, providing services through the current web interface or through third-party and custom portals.

Wrapping up

This year’s user conference has been a major celebration of the many use cases Splunk has helped with, presented by the users themselves.

It has been a broad success, too, thanks both to the number of participants (roughly double last year’s) and to the audience involvement we experienced. The many technical sessions were mostly aimed at users with beginner-to-intermediate experience with the product, but we were surprised to see so many attendees interested in the advanced ones!

Although we collected some criticism of the product (the Flash charting interface, the complex dashboarding system, too many configuration files, …), it all shares one common denominator: it concerns minor aspects of the interface, and none of it is about missing functionality!

If you feel there are important features missing or (reasonable…) things you cannot do with Splunk, voice your questions and we will do our best to help. We have learned a lot in our three days in San Francisco.

Paolo Prigione