Part 2: Reflections on Excel to DAX Translation


Introduction

This post is indeed a continuation of Part 1: Reflections on Excel to DAX Translation. With any project of this magnitude there are going to be learning moments, frustrations, challenges and triumphs. This post is mostly about the frustrations, which for entirely selfish reasons I just really feel like ranting about. Oh, you thought the first part of this series was a rant? Well, that was the Brady Bunch; this one is Tiger King. So strap in! Here we go!

Excel’s Handling of Negative Binary/Octal/Hexadecimal Numbers is Stupid
Some of Excel’s binary, octal and hexadecimal functions like DEC2BIN, DEC2OCT, etc. support negative numbers. The DAX equivalents I created do not. While I could have coded them to be the same, I decided that the way that Excel handles negative numbers of other bases is stupid and I was not going to step over piles of dead birds just to be sucked down a rabbit hole of sadness in a vain attempt to replicate insanity.

First, you can’t reconcile the way Excel handles negative binary numbers (or any other non-base-10 system) with most other sources on the topic. While I understand the pattern being used (two’s complement), it does not agree with the vast majority of other sources. Second, and more importantly, the entire thought process behind not displaying user-friendly negative numbers in other bases is just lame. In Excel logic, while 1001 decimal is 1751 octal, -1001 decimal is 7777776027 octal. What? I mean, I don’t personally care how you programmers store negative numbers internally, but how about presenting me with -1751 octal? How crazy would that be? By Excel logic, -1001 decimal should be presented to the end user as 9999998999 decimal (the ten’s-complement version of the same trick). Except it’s not, because that makes zero sense whatsoever, and if you wrote it that way people would look at you funny, shun you, whisper behind your back and think that you were a complete and utter idiot. Not even “new math” and “common core” changed how negative numbers are written, so the idiots have already had their go at it.
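
If you want to see that pattern for yourself, here is a quick Python sanity check. It assumes (per the DEC2OCT documentation) that Excel stores negatives as a 10-character, 30-bit two’s-complement value, which is what the bit mask below imitates:

# What Excel's DEC2OCT is effectively doing with negatives: emitting the
# 30-bit two's-complement bit pattern in octal instead of using a sign.
print(format(1001, 'o'))                 # 1751
print(format(-1001 & (2**30 - 1), 'o'))  # 7777776027  (what DEC2OCT returns)
print(format(-1001, 'o'))                # -1751       (the sane, human-readable form)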

Hexadecimal is a Stupid Name for Hexadecimal
OK, this one I already knew, but I was reminded of the sheer hilarity, absurdity and idiocy of the term “hexadecimal” while coding DEC2HEX. First off, it is hilarious to me that a supposedly “scientific” vocabulary, the International Scientific Vocabulary (ISV), can’t keep its Greek and Latin etymological origins straight and just mixes and matches them freely. I mean, why not throw in a little Moabite, Etruscan or Sabaean while you’re at it? Why be all exclusionary? I guess history, linguistics and etymology don’t make the cut when it comes to science? Because scientists are too ignorant to form words properly? I have to say, it always cracks me up when scientists do pathetically stupid things.

You see, hexa- comes from Greek and is the prefix for six. Decimal comes from Latin, from decimus, the word for tenth. So, even if you forgive the abomination of randomly scrunching Greek and Latin together, you’re still left with a literal translation of six tenths, which is like 0.6 and not even close to 16! So, it’s just wrong all the way around. The proper Latin-derived term for base 16 is senidenary or sedenary, meaning grouped by 16. The Greek form is hexadecadic (like a 16-sided hexadecagon). The more modern form is decahexadic. But any way you slice it, it sure as hex is NOT hexadecimal, which makes no sense unless you are using base 0.6.

Python…
I actually learned Python as part of this project. I’ve written in all types of programming and scripting languages but had never written a lick of Python or even installed it. Why did I learn Python during a project to convert Excel functions to DAX? Well, it twas…Lifebuoy. OK, actually it was the GAMMA function. The math for the Gamma function is not fun. The integral that defines Gamma has no nice closed form, so it is notoriously hard to evaluate, particularly so if you’ve been blinded by soap poisoning.


So, what is a classically trained engineer to do when the math gets hard? You fudge it and resort to numerical methods, of course. So that’s what I did. And I got to within 12 decimal places of accuracy. So, good job me. I got there because I happened across a Python implementation of the Lanczos approximation of Gamma. So I had to install Python and learn Python in order to translate the code to DAX.

Translating the Python code to DAX was actually pretty cool. It was tricky because there was a recursive element in the code as well as a for loop! Both of those are normally non-starters for DAX but where there is a will there is a way!

The Python code for reference is:

 

from cmath import sin, sqrt, pi, exp

p = [676.5203681218851
    ,-1259.1392167224028
    ,771.32342877765313
    ,-176.61502916214059
    ,12.507343278686905
    ,-0.13857109526572012
    ,9.9843695780195716e-6
    ,1.5056327351493116e-7
    ]

EPSILON = 1e-07
def drop_imag(z):
    if abs(z.imag) <= EPSILON:
        z = z.real
    return z

def gamma(z):
    z = complex(z)
    if z.real < 0.5:
        # Reflection formula handles inputs with real part < 0.5
        y = pi / (sin(pi * z) * gamma(1 - z))
    else:
        z -= 1
        x = 0.99999999999980993
        for (i, pval) in enumerate(p):
            x += pval / (z + i + 1)
        t = z + len(p) - 0.5
        y = sqrt(2 * pi) * t ** (z + 0.5) * exp(-t) * x
    return drop_imag(y)
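
If you want to kick the tires on that, here are a couple of quick spot checks against known values of Gamma (the trailing digits will wobble a bit with floating point):

print(gamma(5))     # ~24.0, since Gamma(5) = 4! = 24
print(gamma(0.5))   # ~1.7724538509, i.e. sqrt(pi)
print(gamma(-0.5))  # ~-3.5449077018, exercises the reflection-formula branch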

 

The equivalent DAX code is:

 

GAMMA = 
    VAR __zInput = MAX('Table'[z])
    VAR __p =
        {
            (0, 676.5203681218851),
            (1, -1259.1392167224028),
            (2, 771.32342877765313),
            (3, -176.61502916214059),
            (4, 12.507343278686905),
            (5, -0.13857109526572012),
            (6, 9.9843695780195716e-6),
            (7, 1.5056327351493116e-7)
        }
    VAR __EPSILON = 1e-7
    VAR __z = IF(__zInput 

 

So what do I have to say about Python now that I have added it to my repertoire of coding languages? Well, I have to say that I am, well, I’m…I’m pretty unimpressed actually. I guess with all of the hype around Python and its supposedly exponential growth in popularity, I honestly and sincerely do not see the attraction. I mean, it’s a programming language and really not much different from most of the other languages I know. I mean, you can have variables, and you can loop, and write functions and, well, you can do exactly what you can do in every other general-purpose programming language. Yaaaaaaayyyyyy….?

But all the hype. Man, going into this I was told the following were unequivocally and 100% true:

  • Python is much more accessible and easier to learn than other programming languages
  • Python has huge tracts of libraries…
  • Python has big data
  • Python is more understandable and readable than other programming languages
  • Python is FAR more productive, requiring fewer lines of code

>cracks knuckles


OK folks, let’s do these in order:
Python is much more accessible and easier to learn than other programming languages
“I mean, gee willikers Harvey, I have an IQ of 12 and even I can learn Python.” No…no you cannot. You see, Python is no easier or harder to learn than any other programming language, because syntax is the easy part. The hard part is actually learning the concepts of programming: how to think in terms of variables and loops and recursion. That’s the hard part about learning to program, not which language you pick as your first one. Once you learn to program, most other languages are a breeze to pick up unless there is something fundamentally different about the language, like how lists are so central to Lisp that even the program code is made of lists, or how context is so central to DAX. New concepts are hard. New syntax is not. So…no.

Python has huge tracts of libraries…
“You can’t believe the options that come with this baby, like the TruCoat, you gotta get the TruCoat or you’ll get the oxidation…” Let’s be clear, this makes zero sense and is clearly not a differentiator. Every programming language…wait, that’s not strong enough…EVERY programming language on the planet that has been around for any length of time has zillions of lines of code in the form of functions, modules and other pre-written packages. Pick one: Perl, R, JavaScript, C#, you name it. They all have mountains of code libraries, APIs and other things that you can just pick up and use. So, this is an ignorant comment because it’s all 1s and 0s in the end, people, so…

Python has big data
Ooo wee, I gotta see this! A programming language that’s gotz the big data! Scintillating. Clearly no. Let’s use one over-hyped meaningless term to sell something else that we want to over-hype. You mean Python has code libraries for accessing data that is stored as shards? See “huge tracts of libraries” above.

Python is more understandable and readable than other programming languages
“My 2 ½ week old learned how to read and understand Python code before she could even lift her head.” Wow, that’s some amazingly readable code. Let’s put this one to the test. Here are standard for loop constructs in a bunch of different languages, including Python:

 

Python
for x in range(0, 3):
  print("value of x: %d" % (x))

C
for( x = 0; x < 3; x++ ) {
  printf("value of x: %d\n", x);
}

 

Wow, yeah, obviously waaaaayyyy more readable, that Python code. You know, because the colon there, that’s way more readable and intuitive than curly brackets. I mean, curly brackets? What moron uses curly brackets to encapsulate a block of something…cough…math. And who uses a & or + to concatenate text when “%d” % (x) is soooooo much more intuitive? Mmmm…hmmm.


Except…hmmm…now that I look at it, out of the nine for loops listed, the Python code is the ONLY code where it is not absolutely obvious that x never gets to 3. Huh, how about that? I mean, the plain reading of the code “for x in range(0, 3):” surely, to the lay person, means that x gets to 3 because, you know, 3 being “in the range” and all. Except, mysteriously, that’s not the case. Apparently 0 is “in range” but somehow 3 is not “in range”. How about that. Amazing. What an odd, unintuitive and dumb construct. Huh. But in every other language it’s pretty clear that x stays “less than 3”.
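
Don’t take my word for it; here is the range thing straight from the horse’s mouth:

print(list(range(0, 3)))            # [0, 1, 2]: 0 is "in range", 3 apparently is not
for x in range(0, 3):
    print("value of x: %d" % (x))   # prints 0, 1, 2 and stops; x never gets to 3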

 

Honestly, if I’m looking at code readability, I have to tip my hat to Pascal. So, more readable? That’s a big, big negative.

 

Python is FAR more productive, requiring fewer lines of code

“I tell you Earl, you’ll be so much more productive in Python you can replace those 50 Java developers with just a single Python developer by golly! Yeeee haawwww!!” This is the whopper lie and this one kills me for various reasons.

 

First, is this really the standard we want for picking a programming language, whichever one has fewer lines of code? Personally, I feel that this is an incredibly misguided and dangerous concept. I mean, if that’s the standard, why in the world did we ever move away from Perl? If this were the case, shouldn’t we all be coding Perl, the language whose fans were so proud of obfuscated Perl code that they held contests to see who could write Perl code in the fewest bytes? Bytes. Bytes I tell you.

Go ahead Python programmers, write an alphametic solver in less than 103 bytes like this Perl code:

 

while($_=$a=pop){/([a-z])/i?map{($t=$a)=~s/$1/$_/g; $a!~/$_/&&push@ARGV,"$t"}0..9:/\b0\d/||eval&&print}

 

Any takers? No? Wait, what’s that? Snake got your tongue? Yeah, that’s what I thought. All talk and no code. Just FYI, by the way, that code comes by way of Michael Woodhams and will take something like send+more==money and solve it to 9567+1085==10652.
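
And just so we’re comparing apples to apples, here is roughly what a plain brute-force Python version looks like (a sketch using itertools.permutations and eval, nothing clever, and it lands comfortably north of 103 bytes):

from itertools import permutations

def solve(puzzle):
    # Brute-force alphametic solver, e.g. solve("send+more==money").
    # Slow but simple: try every assignment of digits to letters.
    letters = sorted(set(c for c in puzzle if c.isalpha()))
    words = puzzle.replace("==", "+").split("+")
    leading = {w[0] for w in words}  # letters that cannot map to zero
    for digits in permutations("0123456789", len(letters)):
        table = dict(zip(letters, digits))
        if any(table[l] == "0" for l in leading):
            continue
        candidate = "".join(table.get(c, c) for c in puzzle)
        if eval(candidate):
            return candidate
    return None

print(solve("send+more==money"))  # 9567+1085==10652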

 

So, that argument holds no water. But even if, even if you could, that’s the criterion for why Python is better? You can write fewer lines of code… Or is it the same amount of code but you imported a library? Which, if that’s the case, see “Python has huge tracts of libraries” above. Earth to Python, you didn’t invent reusable code. Functions have been around for a long time, people. And fewer lines of code is not necessarily a good thing.

 

OK, another reason in a similar vein. Look at the 9 for loops above. Every one of them can be written as a single line of code. So then if we write those for loops as a single line of code, did we just double our productivity? Clearly we did not. So what is this nonsense about productivity and lines of code?
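
And to make that point painfully concrete, here is the Python loop from above squashed onto a single line. Same work, half the “lines of code”:

for x in range(0, 3): print("value of x: %d" % (x))  # identical behavior, now "twice as productive"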

 

Oh, and just one more tidbit. If you look at the Gamma code above, the Python code is 32 lines, the DAX code is 30 lines…

 

Sooooo… Yeaahhhh…

 

Final thoughts on Python
Look, I’m going to clear up the reason why Python is so popular. It’s not exactly a mystery, and it has absolutely nothing to do with any supposed “benefits”. It is popular for the sole reason that Google seemed to inexplicably glom on to it in 2006. That’s it. Nobody had ever heard of it or cared about it until then, despite the fact that it had been around for 15 years.

 

Now, I don’t know for sure, but I’m guessing the adoption by Google went something along the lines of the following:

Some noob programmer Python fanboy at Google that didn’t know anything other than Python got put in charge of something way above their pay grade and this noob programmer Python fanboy was all like “Hey, we’re Google and I’m super cool and edgy and doing something way above my pay grade and we’re like going to do stuff different and stuff and not be all like controlled by ‘the man’ and stuff because I’m a rebel and edgy and stuff. So we’re going to use this brand new programming language called Python because I’m too dumb to know it’s been around for 15 years and nobody uses it because, you know, we’re Google and we’re like super cool and hip and different and not at all evil because, you know, it says so in our corporate motto…and stuff”.

 

And then, everyone else on the team was like: “Right on, we’re Google and we’re also super cool and hip and stuff and we’re going to do something different because we’re also rebels and for no other reason than we can. Middle fingers to ‘the man’…and stuff. So we’re going to use this new cool programming language called Python because we’re also too dumb to know that it’s been around for 15 years and nobody uses it or cares about it.”

 

And thus, Google adopted Python. And, since Google adopted Python, other tech folks took notice and wondered what this new, hip, super cool, trendy programming language was that they were missing out on because THEY were also too dumb to know that Python had been around for 15 years and nobody cared about it. And there you have it. Which, by the by, is exactly the same reason why cancer sticks became popular. So…good job Google hipster Python fanboy?

On the plus side though, Python is absolutely, god-awfully dog slow and the only language in existence that breaks because you don’t indent things. So…it’s got that going for it. Honestly though, I’m not just bagging on Python here. I pretty much feel this way about most languages at this point. I mean, there must be nearly a thousand programming languages listed on Wikipedia. Guess what? They all reduce to binary code at some point so…can anyone say redundant? Gee, I wonder how big our “tracts of libraries” would be if we were all coding the same programming language for the last few decades…

 

Conclusion
Negative binary numbers are stupid, hexadecimal is stupid, Python is stupid, I’m apparently old and you darn kids need to get off my lawn…

 

Stay tuned for Part 3!!


