Wednesday, October 17, 2018

A milestone blog post from Andy Kriebel provides a nice break from TC preparations

As you can tell from my dearth of recent posts, I have been incredibly busy preparing for Tableau Conference.  Hard to believe it is coming up next week!

While hammering out the details of the Hands On Training session I will be co-presenting, I saw that Andy Kriebel has just published his 1000th blog post over at vizwiz.com.

Andy, whom I have not met, is one of our Zen Masters and has been a tremendous advocate for Tableau for years.  This post is a sincere homage to Tableau Public (free!), and I look forward to the next 1000 posts Andy writes.

Back to TC prep.

Questions, comments, concerns and criticisms always welcome,
John

Monday, September 17, 2018

Matlab + Tableau - making math much easier!

A quick note that Tableau can be configured to use Matlab as an external service - just like R or Python.

There is an article from Mathworks here: https://www.mathworks.com/products/reference-architectures/tableau.html

Since my team also owns this feature, I figured I would point it out.  I've always liked Matlab - it is very easy to look at some equations and then "just type them into Matlab."  Thanks for helping us out along the way, Mathworks!
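If you want to try it, the Tableau side works just like the R and Python integrations: point Tableau at the external service (Help > Settings and Performance > Manage External Service Connection), then call it from a calculated field with one of the SCRIPT_* functions.  Here is a rough sketch - the function name is made up, and the Mathworks article above has the exact calling convention for functions deployed to MATLAB Production Server:

    // Hypothetical calculated field.  Assumes a MATLAB function named
    // "mySmooth" has been deployed to MATLAB Production Server per the
    // Mathworks reference architecture linked above.
    SCRIPT_REAL("mySmooth", SUM([Sales]))

Like the R and Python flavors, the arguments have to be aggregates, and the result comes back as a table calculation.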

Questions, comments, concerns and criticisms always welcome,
John

TC is getting closer

I mentioned I would be co-hosting a Hands On Training session at Tableau Conference this year.  That has been keeping me very busy this past month. 

The session I am co-presenting is Intermediate Statistics.  It features many of the areas my team here owns.  To that end, I'm gathering data sources to provide some meaningful training on Trend Lines, Forecasting, Clustering and Correlation.  <Insert "correlation is not causation" warning here>.

Surprisingly, there are not many ready-made data sets that focus on intermediate-level analysis.  Most are either introductory and too basic for this type of hands on training, or are very large data sets that can be overwhelming.

I don't want to tip my hand here yet since I am still working on getting a few data sets nailed down, but I think I have the general idea in place.  It's simply a time-consuming process to get all the details hammered out in advance.

Then on to practicing my speaking!

Questions, comments, concerns and criticisms always welcome,
John

Thursday, August 16, 2018

Getting caught up here and in my day job!


I just noticed that I haven't met my self-imposed goal of updating this blog every week.  I've been locked down with so much day-to-day work that this simply fell off the radar.  Instead of any insight (that's coming up :) ), I'll share what I have been doing.



  1. Filling holes in test coverage.  I had time to look at our code coverage numbers, found some holes, and added tests to validate that the code continues to do exactly what it was designed to do.
  2. Identifying holes in test planning.  As we get ready to send features out to our customers, we review the test plan we have in place to ensure we did not let any test cases slip through the cracks.  We found some very small gaps in our coverage when we did this exercise last time, so I added some tasks to our to-do list and have been helping to get them done.
  3. Attending some good internal training on hiring practices and other related topics.
  4. Many (many!) discussions with partner teams about adding cross-team features in the future.
  5. Preparing for Tableau conference (I wrote about this last time).
  6. And then simply doing my "regular" job, which has been very time-consuming recently.



Looking ahead, I have been thinking about this situation.  If you give an artist a rake as the only implement for making art, you had better expect to see some leaves.  I'll dive into that comment next up.



Questions, comments, concerns and criticisms always welcome,

John

Thursday, August 2, 2018

I'll be speaking at Tableau Conference this year


Just got my confirmation that I will be co-presenting a Hands On Training session on Basic Statistics in Tableau.  This should be a great experience - our team owns all the code that drives the features we will be teaching.  And since I am a tester, if there are any glitches, I guess I can only look in the mirror to find who to contact about them (heh).

More about Tableau Conference here: https://tc18.tableau.com/

Now I need to go prepare the topics we will present!


Questions, comments, concerns and criticisms always welcome,
John

Monday, July 9, 2018

The Guin Constant

I was reading James Aspnes' Notes on Discrete Mathematics this last week and came across this statement in section 2.1.5:
"The natural numbers N. These are defined using the Peano axioms,
and if all you want to do is count, add, and multiply, you don’t need
much else. (If you want to subtract, things get messy.)"

Since the book is aimed at Computer Science majors at Yale, I was bothered by this statement right off the bat.  Computers are pretty good at some aspects of mathematics, but not all.

In this case, I can't use a computer to count indefinitely.  Eventually, my computer will run out of memory, and when I try to add one more to the current number, it will crash.  I know what you are thinking: "I'll just add more memory!"  (It's turtles all the way down.)  That won't work either, since memory is made out of matter, and the universe in which we live has a finite amount of matter.

So suppose we used each bit of matter (atoms, leptons, whatever smaller building block we discover next) 100% efficiently to store the biggest number we can possibly store in this universe.

I call that the Guin Constant.  It is the biggest number this universe can hold precisely, using all of its matter for storage.
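For a back-of-the-envelope value: a common estimate puts roughly 10^80 particles in the observable universe (my assumption here - pick your favorite estimate).  If every one of them stored exactly one bit, then since N bits can represent unsigned integers up to 2^N - 1, we get something like:

    % Assumes ~10^80 particles, each holding exactly one bit.
    % With N bits the largest representable unsigned integer is 2^N - 1, so:
    \[
      G \;\approx\; 2^{10^{80}} - 1
    \]
    % which has about 10^80 * log10(2), or roughly 3 x 10^79, decimal digits.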

And please don't add one to it.  I'd hate to crash this universe.

Questions, comments, concerns and criticisms always welcome,
John

PS: apologies to mathematicians if this constant already exists.  It's hard to find information like this with internet searches and/or the math books within arm's reach of me right now.


Friday, July 6, 2018

Moving day for tests


One of the tasks on my plate is breaking up our old (old!) test suite.  We have a set of tests that live in this suite and, without getting into the gory details, they need to be moved out of that project.  In short, that project is old and requires everything to be built before anything can be tested.  Since unit tests are designed NOT to need everything built - just the bit of code I am trying to test - this is not a good paradigm to follow long term.

So we have started to break up these old tests and move them to the module where they belong.  It's going well so far, but some of the oldest tests are troublesome and time-consuming to move.

In the process, I have learned more about CMake than I ever wanted to know!  As you can imagine, when a test is moved from its original location to a new one, the file that directed it to be built in the old location needs to have that entry removed, and the file that drives the build for the new location needs to be updated to include the file as well.

So in the best case, a single move has to update three files: the test file itself and the two CMake files.  I haven't hit that best case yet - there are always more updates needed - but I am getting better at it each day.  If you are interested, let me know and I can post more details.
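For the curious, here is the shape of the change.  This is a minimal sketch with made-up target and file names, assuming a googletest-style test - not our actual build files:

    # old-suite/CMakeLists.txt: remove the moved test from the
    # monolithic target's source list.
    add_executable(OldTestSuite
      main.cpp
      OtherTests.cpp)       # DateParserTests.cpp entry deleted here

    # dateparser/CMakeLists.txt: build the test next to the module it
    # exercises, linking only what this one test actually needs.
    add_executable(DateParserTests DateParserTests.cpp)
    target_link_libraries(DateParserTests PRIVATE DateParser GTest::gtest_main)

The payoff is that the moved test now builds with just its own module instead of the whole world.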


Questions, comments, concerns and criticisms always welcome,
John