Monday, November 20, 2017

Updating some data about our data for our team

One of the techniques we use to develop new features (which I cannot talk about) is wrapping the new code behind what we call a Feature Flag.  A feature flag is just a variable that we set OFF while we are working on features to keep that code from running until we are ready to turn it ON.  This is relatively basic engineering and there really is not anything special about it.

As a related note, many Windows applications use a registry key to turn features on or off.  We use a text file here with other data about the flag stored in it.  For example, we store not only the name of the flag, but also the name of the team that owns it, a short description of what it is for, and when we expect to be done.
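A made-up sketch of what one entry might look like - the field names and values are invented for illustration, since the actual file format isn't shown here:

```text
# hypothetical flag metadata entry
flag:        ShowAdvancedAnalyticsTooltipsForTheWebClients
owner:       Web Analytics team
description: Gates the new advanced analytics tooltips in the web clients
expected:    2017-12-15
```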

In some cases those dates can be wrong, or the name of a flag needs to be changed to make its purpose clearer.  Imagine this contrived example: a flag named "tooltip" is not all that useful, but a flag named "ShowAdvancedAnalyticsTooltipsForTheWebClients" is a bit more explanatory.  And if work finishes early or lags behind estimated dates, the dates can change as well.

So this week I expect to update this metadata for our flags.  It is very low priority work, but with US holidays approaching, now seems like the best time to get it knocked off the to-do list.

Questions, comments, concerns and criticisms always welcome,

Wednesday, October 25, 2017

Don't bother porting dead code

I have a task to move (we call it "porting") some test code from one location to another.  The details are not interesting, but it did involve moving a test "Shape" object and a "Text" object.

The text object and the shape object both inherited from the same parent class, included the same set of header files, were co-located in the same directory within the project I wanted to modify and were otherwise similar in structure.  For a variety of reasons, though, the text object could be moved without much effort at all.  The shape object proved far more difficult.

At first, the compiler complained it could not find shape.h.  That was a little tricky, but it boiled down to having 2 files named shape.h in the project, and the path to the one I wanted was not specified correctly.  Fixing that surfaced the next error: the shape object could no longer inherit from its parent class.

And thus began about 2 weeks of trying to get the darn code to build.  I would find and fix one problem only to get to the next error.  This is not unusual - we call it peeling the onion - but it is time consuming. 

For my needs this is a medium priority task at best, so I wasn't working on it full time - just fitting it in when I could spare an hour or so.  I started with 27 build errors, watched that climb to about 100, then whittled it all down to 2.

But at this point I was 2 weeks into build files, linking errors, etc… and decided to try a new approach since I felt I was treating symptoms and not the underlying problem.

I put everything back where it was (reverted my changes, so to speak) and rebuilt.  I then stepped through the code to see how the Shape object was being allocated and used.

It wasn't.

Although it is referenced in the tests, it was never used.  It was dead code.


I was able to delete the dead code, move everything else I needed and got unblocked.

Lesson learned - do your investigation early in the process to determine what actually needs to be ported!

Now, off to a 2 week hiatus.  See you when I am back!

Questions, comments, concerns and criticisms always welcome,

Friday, October 20, 2017

Back from Tableau Conference 2017

What a whirlwind that was.  I started the week helping folks get their schedules straight, then did a little docent work helping people find rooms.  I was also crowd control for seating during the keynotes (which were a blast! Adam Savage was pretty terrific) and even got to do a little security work in there.

Overall, this was a fantastic conference.  I got to meet several of our customers 1 on 1 and gained some tremendous insights into what everyone wants from us.  That alone made the conference worthwhile for me - now I have a much better idea where I need to focus my time and test efforts moving forward.

If you come next year in New Orleans, be sure to let me know!  I'd love to spend some time chatting with any Tableau users that happen to be reading this blog!

Questions, comments, concerns and criticisms always welcome,

PS: Yes, someone mentioned that it seemed like everywhere we walked we always walked through the casino.  I just chuckled and mentioned that means that whoever designed the walkways did the job right...

Tuesday, October 3, 2017

I so badly want to rewrite this little bit of test code, but won't

I saw something similar to this in a test script:

#ifdef Capybara
#include "A.h"
#include "B.h"
#include "B.h"

Fair enough.  I can imagine that this was NOT the way this code snippet was checked in - it probably changed over time to what it is now.  I haven't yet dug into the history of it.

That seems a bit inefficient to me and I would prefer to change it to:

#ifdef Capybara
#include "A.h"
#include "B.h"

Fewer lines, and it should perform the exact same includes. 

But I won't make this change and here is why:
  1. It may break.  The odds are small, but why add risk where the system is currently working?
  2. It's a small benefit overall.  Maybe 1/1000 of a second faster build times.
  3. It takes some amount of time to make the change, build, test it, get a code review, check it in, etc…  I can't see any time savings for this change in the long run.
  4. A better fix than this one change would be a tool looking for this pattern in ALL our files.  Then use the tool to make the change everywhere, and make the tool part of the regular code checks we perform constantly.

But considering #2 overall, even a tool would not pay for itself.  So I am leaving this as is for now and moving on to my next task.

(But if I have the fortune of needing to edit that file, I will be sorely tempted to make this fix at the same time :) )

Questions, comments, concerns and criticisms always welcome,

Monday, September 25, 2017

Tableau Conference is 2 weeks away and I am getting ready

I got my schedule set for TC which is now just around the corner.  I will be working as a Schedule Scout, helping folks that attend get their schedules created.  Seems like a great way to get a conversation started face to face with our customers.

For TC last year, I had the same goal of talking directly with customers.  I looked around at all the jobs we can do - at TC, ALL the jobs are performed by Tableau employees - and figured that working at the logo store would be ideal.  My thought was that I would meet a large number of customers, and I did!  The only downside was that the store was very busy and I did not have much time to interact with everyone.

This year it looks like I will get a chance to work 1:1 with people and I am looking forward to it.  I hope to meet you in Vegas!

Questions, comments, concerns and criticisms always welcome,

Tuesday, September 19, 2017

Working with code coverage this week

I've been working on code coverage a bit this week.  Our team uses Bullseye for gathering information and while it has its good and bad points, overall it is fairly easy to use manually. 

More specifically, I have been trying to identify files that have no automated coverage at all.  This would mean that such a file could, in theory, not even exist, and we would not know.  The reality is that Tableau would fail to build, but having no automated coverage is still a poor state to be in.

And this is where I hit my first snag.  We have several different types of automation we run each day.  Unit tests are an obvious starting point, and there are also end to end (integration) tests.  Code coverage numbers for those are easy enough to gather and merge together.

We also run other types of tests like security tests and performance tests.  Getting numbers from something like a performance test is tricky.  Since we are trying to measure accurate performance, we don't want to also slow down the system by trying to monitor which lines of code are being hit or missed.  On the other hand, the code being hit should be the exact same code the other tests already cover - in other words, we should not have special code that only runs when we try to measure performance.  It's hard to validate that assumption when we specifically don't want to measure code coverage for half of the equation.

In any event, we do have a few files with low coverage, and we are working to ensure that code has adequate automated tests on it moving forward.  More on this - what adequate means in this context - coming up.

Questions, comments, concerns and criticisms always welcome,

Friday, September 8, 2017

How I learned Tableau

Way back when I came to Tableau, I needed to ramp up on how Tableau works.  While I like to think I understand much of the math behind the scenes, I still needed to figure out the UI, connecting to data sources, and functionality like the mapping Tableau provides.

One of my buddies from way back was talking with me earlier this week and had the same dilemma I faced: how to learn Tableau relatively quickly.

When I started, I bought the Official Tableau 9 book by George Peck.  Great book - it starts with the basics and builds from there.  It's about 2 years out of date at this point (there is a 10.0 book available now), but I still use it from time to time to refresh my memory.

But books only go so far, and I really learn best with hands-on work.  I found a "20 days to learn Tableau" chart that I also used.  It really resonated with me - it had videos to watch, whitepapers to read (I actually found a typo in one of the papers, reported it to the author here at Tableau, and it got fixed) and activities to complete.  I recommended it to my friend and I hope he gets as much out of it as I have.

Questions, comments, concerns and criticisms always welcome,