
To Splunk or not to Splunk - Either Way Listen to Your Machine Data


Listen to your Machine Data. Yes, Do.

Machine data and log analytics are all the rage these days, but should they be? And why should you invest in gathering, centralising and analysing something as seemingly boring and mundane as machine data and logs?

Structured, Semi-Structured and Unstructured

We are used to structured data, stored in relational databases and, more recently, in file- and blob-based data stores. This kind of data has always been interesting, and is used in business and application contexts. Today, however, we are starting to latch on to the power inherent in semi-structured and unstructured data. There are a number of innovative things we can do if we index, store, correlate and analyse all kinds of machine-generated data, and this will only get more interesting with the proliferation of IoT devices gathering telemetry.

So what can you do with Machine Data and Logs?

Well, there is no definitive list; really, it's open to the imagination. Data rules the world. Data-driven businesses with new business models are popping up everywhere. In short, data has currency.

For the purposes of this post, let's focus on a common use case: applying machine and log data in an IT Operations context. Leveraging and mining this data to improve IT service delivery, availability and performance makes sense, and adds to an IT department's capability and service offering.

Drivers for IT Ops Analytics

Through correlation and analysis of all our machine data we may be able to:
  • proactively identify issues
  • predict time and point of potential future failures
  • pinpoint root cause and reduce mean time to restore
  • reduce cost through smarter delivery of services
  • gain insights into environments in new ways to drive digital innovation
to name just a few key points.
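To make the first few points concrete, here is a minimal C# sketch of the kind of correlation involved: counting error events per host in a stream of log lines. The log format and host names are made-up assumptions purely for illustration.

```csharp
using System;
using System.Linq;

class LogCorrelationDemo
{
    static void Main()
    {
        // Hypothetical machine data in an assumed "timestamp level host message" format
        var lines = new[]
        {
            "2018-03-01T10:00:01 INFO web01 request served",
            "2018-03-01T10:00:02 ERROR web02 disk latency high",
            "2018-03-01T10:00:03 ERROR web02 disk latency high",
            "2018-03-01T10:00:04 INFO web01 request served",
        };

        // Group error events by host: a cluster of errors on one host
        // helps pinpoint a likely root cause before users start calling
        var errorsByHost = lines
            .Select(l => l.Split(' '))
            .Where(parts => parts[1] == "ERROR")
            .GroupBy(parts => parts[2])
            .OrderByDescending(g => g.Count());

        foreach (var g in errorsByHost)
            Console.WriteLine($"{g.Key}: {g.Count()} error(s)");
    }
}
```

A real pipeline would index and query at far larger scale, but the parse-correlate-aggregate pattern is the same.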

So this sounds good, right? We want a piece of that for sure. But how do we go about it, and what kind of tools would we need to deliver such new capabilities for IT Operations Management?

Tools that can help us meet the IT Operations Analytics challenge

The good news is that there are a number of mature products and solutions out there, with both commercial and open source options readily available. The following table lists a few popular options that are, in my opinion, worth looking into. It is in no particular order, nor is it an exhaustive list.


Products        Commercial / Open Source    On Premise / Cloud
-               Commercial                  Both
-               Commercial                  Cloud
-               Commercial                  Cloud
-               Open Source                 On Premise
-               Commercial                  Both
-               Commercial                  On Premise


On Premise vs Cloud

Depending on what is important to you and your organisation, there is no definitive answer as to the best delivery model for a log analytics solution: on premise or cloud.

Some of the reasons why you would go on premise are:
  • Retain full control of your data
  • Flexibility of customisation
  • Data sovereignty, data security and backup concerns
  • Unreliable or low bandwidth links to cloud providers
  • Frequent need to bring data back on premise, with the associated egress costs
On the other hand, most of the cloud log analytics providers above are pretty mature, highly available and secure by design these days. In other words, they are enterprise ready. And of course there is the promise of infinite capacity, so you can ingest data to your heart's content without having to invest in costly, capital-intensive on premise infrastructure.

So unless you are facing major regulatory or compliance hurdles, I'd suggest giving the cloud a go. But do your homework on your projected data volumes and the associated costs to avoid bill shock, and make sure going into the cloud is indeed the most cost-effective path for your business. Large organisations may be able to build their own infrastructure and run it at a lower cost than cloud providers charge.
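As a hypothetical back-of-envelope version of that homework (the volume and price below are placeholder assumptions, not any vendor's actual rates):

```csharp
using System;

class IngestCostEstimate
{
    static void Main()
    {
        // Placeholder assumptions - substitute your own projections and quotes
        double gbPerDay = 50.0;    // projected daily ingest volume
        double pricePerGb = 2.50;  // assumed cost per GB ingested, USD

        double monthlyIngestGb = gbPerDay * 30;
        double monthlyCost = monthlyIngestGb * pricePerGb;

        Console.WriteLine($"{monthlyIngestGb} GB/month at ${pricePerGb}/GB = ${monthlyCost:N0}/month");
    }
}
```

Run the same arithmetic against a quote for equivalent on premise capacity, and the cheaper path for your data profile usually becomes obvious.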

The Wrap

Hopefully the above thoughts have provided some hints and pointers to get you started on your log analytics journey. Personally, I think the potential is significant, and investing in this space is the right thing to do.

Make sure you have people who are interested in using the technology creatively. Define your use cases, then actively get answers to your burning questions by driving value through analysis and visualisation of your existing log and machine data.

Ah yes, to Splunk or not to Splunk…


Cheers

MB

AdSense Account Stats UWP App Released

Introducing AdSenseAccountStats App

For all those AdSense account owners with Windows mobiles or tablets out there, I recently released a new UWP app that may come in handy. This simple app will allow you to quickly and easily check up on your AdSense performance.

Watch your monetization efforts pay off and your earnings grow

After logging in with your AdSense account and authorising the app for read-only access, you can easily see metrics such as:
  • Earnings, Clicks, Cost Per Click
  • Page Views, Click Through Rate and RPM
  • Coverage

Code components that make it work

The app simply leverages Google APIs, specifically the AdSense Management API v1.4, and Google OAuth v2 for authentication of the user. All calls to the AdSense APIs have to be authenticated, or else they won't work.

So the first problem to tackle was Google OAuth v2 for a UWP app. There are complexities around the Google-provided libraries and their support for UWP, which I have touched on in another blog post.

OAuth v2 Authentication

The following code snippet deals with authentication successfully:

            UserCredential credential;
            try
            {
                // Load the OAuth client secrets that ship with the app
                using (var stream = new FileStream("client_secret.json", FileMode.Open, FileAccess.Read))
                {
                    // Request read-only AdSense access; tokens are persisted
                    // in the Windows PasswordVault via a custom data store
                    credential = await GoogleWebAuthorizationBroker.AuthorizeAsync(
                        GoogleClientSecrets.Load(stream).Secrets,
                        new[] { AdSenseService.Scope.AdsenseReadonly },
                        "user",
                        CancellationToken.None,
                        new PasswordVaultDataStore()
                        );
                }
            }
            catch (UwpCodeReceiver.AuthenticateException ex)
            {
                // Login failed or authorisation not granted
                credential = null;
                await LoginFailed(ex);
            }
            catch (Exception ex)
            {
                // Anything else, e.g. a missing secrets file or network trouble
                credential = null;
                await SomethingWrong(ex);
            }

Creating a client service for your API calls

Once the user has logged on, authenticated their AdSense account, and granted the app permission to access their AdSense details, you then create a client service to use for your API calls. Pass the credential object created by the code above to the service's HttpClientInitializer.

            var adSenseService = new AdSenseService(new BaseClientService.Initializer()
            {
                HttpClientInitializer = credential,
                ApplicationName = "AdSense Accounts Stats",
            });

Setting up and executing an API request

After the user has logged on and allowed access for the app, the next piece is to construct a valid API request, execute it, and handle the response in a meaningful way. This can be achieved along the following lines:

            var earningsRequest = adSenseService.Reports.Generate(startString, endString);
            earningsRequest.Dimension = dimensionString;
            earningsRequest.UseTimezoneReporting = true;
            earningsRequest.Metric = new List<string>{ "earnings","page_views","individual_ad_impressions","clicks",
                                                "page_views_ctr","individual_ad_impressions_ctr",
                                                "page_views_rpm","individual_ad_impressions_rpm","cost_per_click",
                                                "ad_requests_coverage" };

            var earningsResponse = await earningsRequest.ExecuteAsync();

Once you have that response object, you can use it in any way you see fit, for example by binding its rows to a list control as an items source.
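For instance, here is a minimal, self-contained sketch of formatting report rows for display. The sample rows are made up; in the app itself they would come from earningsResponse.Rows, which holds one list of string cells per row, in the order the metrics were requested.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

class ReportRowsDemo
{
    // Join each row's cells into a display string, e.g. for a list items source
    static IEnumerable<string> FormatRows(IEnumerable<IList<string>> rows)
    {
        return rows.Select(row => string.Join(" | ", row));
    }

    static void Main()
    {
        // Hypothetical rows shaped like a "date, earnings, page_views" report
        var rows = new List<IList<string>>
        {
            new List<string> { "2018-03-01", "1.23", "456" },
            new List<string> { "2018-03-02", "0.98", "401" },
        };

        foreach (var line in FormatRows(rows))
            Console.WriteLine(line);
    }
}
```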

Hope you enjoyed this post, and if you like, go grab the app from the Microsoft Store.

Cheers

MB



Quick Look at Application Availability Monitoring using free IBM Cloud Service

I recently wrote about web application availability monitoring using Microsoft Azure and Application Insights. You can read all about that here. Spoiler alert: Microsoft's offering is pretty impressive 😀.

Not wanting to favour one vendor over another, I figured I'd have a quick look at the IBM equivalent. IBM's cloud has recently been rebadged: what was previously known as SoftLayer and/or Bluemix is now officially named IBM Cloud.

IBM Cloud has a freemium model where some services - the ones deemed "lite" - are free.  Luckily, their Application Availability monitoring service falls into this bucket so that allowed me to have a go and road test the solution. Let's take a look...

IBM Cloud - Application Availability Monitoring




It was relatively simple to get things going, although there were a couple of quirks and hiccups along the way. Nevertheless, the following five high-level steps will get you there:

  1. Sign up for an IBM Cloud account - the free one will do
  2. From the Catalogue, create a new basic/free CloudFoundry app - seems that availability monitoring can only be connected to a CloudFoundry app on the IBM Cloud
  3. From the Catalogue under the DevOps heading, create a new Availability Monitoring service - connect this to your CloudFoundry app
  4. Create and configure a new synthetic test - these can be for web pages or APIs, single action or multistep tests; pick the worldwide locations from where to run the test, frequency, and response validation rules
  5. Voila, wait for the tests to run and you will start to get response times and success/failure alerts

The out-of-the-box visualisations and views are not bad, although the navigation takes a little getting used to. Following are some screenshots to give you an idea of what to expect:





The Verdict

In summary, this is a decent service, and the synthetic multi-step tests that mimic an end user transaction are handy. As far as I understand, multi-step tests have to be written in Selenium and uploaded to the IBM Cloud.
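For a rough idea of what such a test script looks like, here is a minimal multi-step Selenium sketch in C#. The target URL, link text and validation rule are placeholder assumptions; check IBM's documentation for the exact script format their upload expects.

```csharp
using System;
using OpenQA.Selenium;
using OpenQA.Selenium.Chrome;

class SyntheticTransactionTest
{
    static void Main()
    {
        using (IWebDriver driver = new ChromeDriver())
        {
            // Step 1: load the landing page (placeholder URL)
            driver.Navigate().GoToUrl("https://example.com/");

            // Step 2: click through, mimicking an end user transaction
            driver.FindElement(By.PartialLinkText("More information")).Click();

            // Step 3: validate the result, much like the service's
            // response validation rules would
            if (string.IsNullOrEmpty(driver.Title))
                throw new Exception("Validation failed: empty page title");
        }
    }
}
```

Each step's timing is what feeds the waterfall view mentioned below, so it pays to keep steps small and meaningful.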

The screens do not always seem to render and refresh reliably, but that could just be an issue with my browser. My basic free CloudFoundry .NET app crashes regularly, but I suspect I have not given it enough memory to run.

If you are an existing IBM customer who is already in the IBM ecosystem, this new capability is worth exploring.  The ability to drill down on the synthetic transaction results and get a waterfall type view of step timings is neat.

One drawback is that this service does not run independently of IBM Cloud hosted apps. In other words, unlike the Microsoft flavour, you cannot use it to monitor just any website. Unless I have missed something?

Ultimately, try it and see if it's right for you.  It may be worth your time exploring this one. Enjoy.

MB


Links

IBM Cloud
CloudFoundry
Selenium


