Tuesday, September 19, 2017

Working with code coverage this week


I've been working on code coverage a bit this week.  Our team uses Bullseye to gather coverage data, and while it has its good and bad points, overall it is fairly easy to use manually.

More specifically, I have been trying to identify files that have no automated coverage at all.  In theory, such a file could be deleted and none of our tests would notice.  In reality, Tableau would fail to build without it, but having no automated coverage at all is still a poor state to be in.

And this is where I hit my first snag.  We have several different types of automation we run each day.  Unit tests are an obvious starting point, and there are also end to end (integration) tests.  Code coverage numbers for those are easy enough to gather and to merge together.
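
To make that uncovered-file hunt concrete, here is a minimal sketch of the kind of script you could use to flag files with zero coverage.  It assumes the merged coverage data has already been exported to a CSV with one row per source file; the column names (file, covered, total) are placeholders for illustration, not Bullseye's actual schema.

```python
# Minimal sketch: flag source files with no coverage from a merged report.
# Assumes a CSV export with one row per file; the column names ("file",
# "covered", "total") are placeholders, not Bullseye's actual schema.
import csv
import sys

def uncovered_files(report_path):
    with open(report_path, newline="") as f:
        for row in csv.DictReader(f):
            # A file with measurable code but zero hits is our target.
            if int(row["covered"]) == 0 and int(row["total"]) > 0:
                yield row["file"]

if __name__ == "__main__":
    for path in uncovered_files(sys.argv[1]):
        print(path)
```

Anything this prints is a candidate for the "could not exist and we would not know" bucket.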

We also run other types of tests, like security tests and performance tests.  Getting numbers from something like a performance test is tricky.  Since we are trying to get accurate performance measurements, we don't want to slow the system down by also monitoring which lines of code are being hit or missed.  On the other hand, the code being hit should be exactly the same code the other tests already cover - in other words, we should not have special code paths that only run when we measure performance.  It's hard to validate that assumption when we specifically don't want to measure code coverage for half of the equation.

In any event, we do have a few files with low coverage, and we are working to ensure that code has adequate automated tests on it moving forward.  More on this - what adequate means in this context - coming up.

Questions, comments, concerns and criticisms always welcome,
John

Friday, September 8, 2017

How I learned Tableau


Way back when I came to Tableau, I needed to ramp up on how Tableau works.  While I like to think I understand much of the math behind the scenes, I still needed to learn the UI, connecting to data sources, mapping, and the rest of the functionality Tableau provides.

One of my buddies from way back was talking with me earlier this week and had the same dilemma I faced: how to learn Tableau relatively quickly.

When I started, I bought the Official Tableau 9 book by George Peck.  Great book - it starts with the basics and builds from there.  It's about 2 years out of date at this point (there is a 10.0 book available now), but I still use it from time to time to refresh my memory.

But books only go so far, and I really learn best with hands-on work.  I found a "20 days to learn Tableau" chart that I also used.  It really resonated with me - it had videos to watch, whitepapers to read (I actually found a typo in one of the papers, reported it to the author here at Tableau, and it got fixed) and activities to complete.  I recommended it to my friend and I hope he gets as much out of it as I have.

Questions, comments, concerns and criticisms always welcome,
John

Tuesday, August 29, 2017

Stochastic Processes


I'm a big believer in online classes for brushing up old skills or developing new ones.  I've taken several over the past few years, and while some classes don't live up to expectations, I found one that has been a pretty fun course so far.

It's called Stochastic Processes: Data Analysis and Computer Simulation, from Kyoto University in Japan.  Here's a link to the class on edX.  It is self paced and wraps up Aug 2, 2018.  It is divided into 6 weeks of classes, and each week is expected to take 2-3 hours to complete.

I have spent MUCH more than that simply wiping the rust off of my physics classes from college.  Don't get me wrong - I love the experience of going back to Albert Einstein's 1905 paper on Brownian motion to help with that chapter.  I understood just enough of his paper to make the simulation more understandable, and having an excuse to read anything by Einstein is just icing on the cake.
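
For anyone curious what the simulations involve, here is a toy 1-D Brownian motion in Python.  This is my own sketch, not the course's notebook - the constants are arbitrary - but it checks Einstein's famous result that the mean squared displacement grows linearly with time, <x^2> = 2Dt.

```python
# Toy 1-D Brownian motion: many independent walkers taking Gaussian steps.
# My own sketch, not the course's material; constants are arbitrary.
import numpy as np

rng = np.random.default_rng(0)
D, dt, n_steps, n_walkers = 1.0, 0.01, 1000, 5000

# Each increment is Gaussian with variance 2*D*dt; a walker's position is
# the cumulative sum of its increments.
steps = rng.normal(0.0, np.sqrt(2 * D * dt), size=(n_walkers, n_steps))
x = np.cumsum(steps, axis=1)

# Einstein's prediction: the mean squared displacement after time t is 2*D*t.
t = n_steps * dt
msd = np.mean(x[:, -1] ** 2)
print(f"simulated <x^2> = {msd:.2f}, predicted 2*D*t = {2 * D * t:.2f}")
```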

And there has been a lot of that.  I had to keep looking up mathematical concepts I haven't used in a long time (the Dirac delta function, for instance).  Again, this was worthwhile.
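
For reference, the property I had to go look up is the one that defines the Dirac delta: under an integral it picks out the value of a function at a single point,

\[
\int_{-\infty}^{\infty} f(x)\,\delta(x - a)\,dx = f(a).
\]

That sifting behavior is what makes it handy for modeling idealized, instantaneous kicks in stochastic models.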

If you get a chance, know Python, and meet the other requirements, this may be an interesting class to take.  Plus, auditing the class is free, and you can't beat that price!

Comments, questions, concerns and criticisms always welcome,
John

Tuesday, August 15, 2017

I'm waiting for this book about using Tableau with R


Last week I wrote a bit about the Matlab support with Tableau.  Our team also owns R support (it is all built on a REST API) and many people have been using R for years with Tableau.  Talks about R integration are a very popular topic at Tableau Conference each year as well, so there has been a tremendous amount of interest in this area.

So much interest, in fact, that Jen Stirrup has written a new book, Advanced Analytics with R and Tableau.  It will be available in paperback and ebook formats and is due out on September 6.  That is less than a month away as I write this, and I am looking forward to getting my copy.

Good luck, Jen.  I hope you sell many copies of this book!

Questions, comments, concerns and criticisms always welcome,
John

Monday, August 7, 2017

Matlab and Tableau!


Kudos to the folks at Mathworks for their efforts to bring Matlab into the Tableau world!  Details about this are here.  Nice job, gang!

As for testing, this is one of my team's areas.  We also own R and Python integration, so we had well-established test plans for this space.  Mathworks did such a good job on their implementation that there was frankly not much for us to do - a couple of string change requests and that was really about it.  We added some automated tests to validate the behavior is correct - to tell us if we change something that would break using the Matlab server - and have them up and running now.  The side benefit of the automation is that we have no manual testing left over from this effort, which means we are not slowed down at all in the long term even though we have added new functionality.  From a test point of view, this is the ideal case.
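
To give a flavor of what such an automated check can look like, here is a minimal sketch that round-trips a script through an external analytics service.  It assumes a TabPy-compatible /evaluate endpoint on the default port 9004; the host, script, and expected result are placeholders, not our actual test suite.

```python
# Minimal smoke test for an external analytics service endpoint.
# Assumes a TabPy-compatible /evaluate API; host and port are placeholders.
import requests

def evaluate(script, **args):
    # TabPy-style payload: named arguments (_arg1, _arg2, ...) plus a script.
    payload = {"data": args, "script": script}
    resp = requests.post("http://localhost:9004/evaluate", json=payload, timeout=10)
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    # Double each value on the server and verify the round trip.
    result = evaluate("return [x * 2 for x in _arg1]", _arg1=[1, 2, 3])
    assert result == [2, 4, 6], result
    print("external service round trip OK:", result)
```

Inside Tableau, this same call path is what the SCRIPT_* calculated-field functions (SCRIPT_REAL, SCRIPT_STR, and so on) exercise.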

We never want to build up manual test cases over time.  Any growth there will eventually add up to more time than the test team has available to complete the work.  Obviously, this doesn't work in the long term, so we have made a concerted effort to keep 100% of our test cases automated.

So, yay us!

And thanks again to Mathworks.  FWIW, I truly like Matlab.  It is very easy to look at some mathematical equation and simply type it into Matlab - it almost always works the very first time I try it.

Questions, comments, concerns and criticisms always welcome,
John

Thursday, August 3, 2017

Cartographies of Time - a mini-review


We have an internal library here at Tableau and, like any library, we can check out books to read or study.  We had the same setup at Microsoft as well, with a heavy emphasis on technical books.  Any computer company will have books on programming habits, design patterns, Agile and other fields like this.

Tableau also has a large section on data visualization.  The whole spectrum is covered here, from books on how to write an efficient graphics routine to how to best present data on screen in human-readable form.

A new book called Cartographies of Time arrived this last week; it is a history of the timeline.  I saw it on the shelf and grabbed it since I am a fan of medieval maps and the cover has a map in that style on it.  It is a fascinating book that covers the very first attempts at timelines and brings us up to the modern day.

The most striking aspect so far - I've not gotten too far into the book - is the sheer artistic skill of the early timelines.  The people who created them worked very hard to fit a vibrant image, a workable color scheme and a tremendous amount of data into one chart.  It is simply amazing to see, and if you have the opportunity I recommend picking up a copy of this book for yourself.

Questions, comments, concerns and criticisms always welcome,
John

Thursday, July 27, 2017

A brief aside about data at the Tour de France


I'm a bike race fan and I really enjoy watching stage races like the Tour de France.  The colors, the speed and the racing are just a great spectacle.

One of the teams there this year was Dimension Data.  They use Tableau to analyze the TONS of data they get on the riders, and I read, re-read and read again this article on how they do it: https://www.dcrainmaker.com/2017/07/tour-de-france-behind-the-scenes-how-dimension-data-rider-live-tracking-works.html

Now, if I can just get myself invited along on a race to help them with Tableau…

And congratulations to Edvald Boasson Hagen!

Questions, comments, concerns and criticisms always welcome,
John