
Why R is not likely to go down soon

December 21, 2019

I know why Python is becoming more popular than R. Its syntax can be grasped by kids, whereas R needs a professional computer scientist’s mind to appreciate it. R is functional-programming friendly, and that demands thinking outside the obvious box.

Also, I get that the switch to Python promises a brighter career to the IT professional.

However, for old folks like me who are no longer after a career uplift, I will stick to R – if not professionally, at least privately – because it is the RIGHT language to use.

Here is a study on why R is not likely to disappear so soon, at least among those doing real data science – those in the health sciences.





R for Everything

November 5, 2019


All over the internet and in ML/AI forums, we see the constant debate. I must admit, there are slightly more Python fans than R fans. I know why people are switching – it is for jobs! For economic reasons. I get that. But for an old timer like me, I am more after fun and enjoyment, not just income. Anyway, who likes to change every couple of years whenever a new thing comes along? I get tired of this.

Python’s syntax is easier to deal with, I get that too. I admit R takes some time to learn, but it is not the syntax; it is the functional paradigm it exposes to the newcomer, who often has no inkling of what the functional programming paradigm is. Due to Python’s fan base, a lot of myths are being propagated by its proponents. I, for one, will not be switching anytime soon, because I believe I have what I need in R. Here are some examples of the myths, and most who believe them have very little knowledge of computing theory.

Python is more powerful than R? Like, you can write algorithms in Python that you cannot write in R? Huh? Both Python and R are Turing Complete (TC) languages! TC means any algorithm you can think of can be expressed in either language. You are limited only by your imagination. In fact, you can often express it with less verbosity in R than in Python, because R is applicative (functional). So if you are looping in R with a for or while loop, you are doing it wrong. In R, you recur, you apply, you filter, you reduce.
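A minimal sketch of that style in base R – `Filter`, `sapply`, and `Reduce` standing in for an explicit loop. The task (summing the squares of the even numbers) is my own illustrative choice:

```r
# Sum of the squares of the even numbers in 1..10, with no explicit loop.
xs      <- 1:10
evens   <- Filter(function(x) x %% 2 == 0, xs)  # filter: 2 4 6 8 10
squares <- sapply(evens, function(x) x^2)       # apply:  4 16 36 64 100
total   <- Reduce(`+`, squares)                 # reduce: 220
total
```

The same pipeline in loop form would need mutable accumulators; here each step is a single expression you can test in isolation.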

I admit, to the non-professional programmer Python is easy – after all, it is like the BASIC language of many, many years ago. Between the two, Python is whitespace-sensitive while R is not (R is, after all, inspired by Scheme, which came from Lisp). What does that mean? It means the Python coder has to be conscious of white space: she has to pay attention to her indentation on every line, especially inside a code block. This is not true in R. The {} marks a code block, and when you see that, it is an immediate signal that you have a set of code to be treated as a unit. In R, the white spaces are non-events; in a whitespace-sensitive language, white space is real trouble.
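To see the point, here is a deliberately badly indented R function (a toy of my own making). R parses it happily because the braces, not the indentation, delimit the blocks:

```r
# The same function with wildly inconsistent indentation; R does not care,
# because {} delimits the blocks.
f <- function(x) {
        if (x > 0) {
"positive" } else {
    "non-positive"
  }
}
f(5)   # "positive"
f(-1)  # "non-positive"
```

The equivalent misindentation in Python would be a syntax error, not a style complaint.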

With Python, you do not need to comment code? This is more fake news. Once you read Python code without comments, code that pulls in libraries from all over the place, good luck to you when you are asked to modify or debug it. I hope you stay safe and do not break anything there. You also won’t be working for me if you love comment-free code. It is extreme folly to think the code is the comment. This is more hype made up by fans who usually do not come from a computer science background.

Why am I staying put with R? Python started out as a general-purpose language and moved into ML. Yet R, which started in statistics, is now moving toward being general purpose as well. It is a matter of facilities.

Can you write command line app in R? Yes.

Can you write a desktop app in R? Yes.

Can you write data manipulation, data cleaning, etc., in R? Yes, without resorting to Perl.

Can you write web scrapers in R? Yes.

Can you write web apps in R? Yes.

Can you consume or expose web services in R? Yes.

Can you write job control apps in R? Yes.

Have I left anything out? Yes.

Can you write games in it? Believe me – yes.

If not, then R is good for everything.
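As one small illustration of the data-cleaning point above, base R’s own regex functions cover what one might otherwise reach to Perl for. The messy price vector is a made-up example:

```r
# Clean a messy vector of prices: strip currency symbols, commas, and
# stray spaces with a base-R regex, then coerce to numeric. No Perl needed.
raw   <- c(" $1,200 ", "$85", " $3,400.50")
clean <- as.numeric(gsub("[$, ]", "", raw))
clean  # 1200.0   85.0 3400.5
```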



Techno-hype why old is new again

September 6, 2019

Somewhere, somehow, when something is new, it must be good, or better than the one that came before. I don’t believe it. Hype makes the world go round. People feed on this. Frankly, I have no more appetite to learn another programming language.

What they put in to enhance a programming language, they learned from Lisp.

They don’t like its syntax because it tracks close to its inspiration, the lambda calculus of Alonzo Church. But that is why it is so elegant and concise.

In functional-capable languages, if you are writing loops and not using apply, recursion, map, or reduce, you are doing it wrong. I was saying to an inquirer asking me about R (which came from Scheme): if you are looping, and not recurring or applying, then you are not using the power of the language. Those facilities make your code smaller and so much easier to debug.
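To make that concrete, here is the same tiny computation – a factorial, my own toy example – written first as a recursion and then as a fold with `Reduce`, with no loop in sight:

```r
# Factorial by recursion rather than a for loop.
fact <- function(n) if (n <= 1) 1 else n * fact(n - 1)
fact(5)            # 120

# The same computation expressed as a fold.
Reduce(`*`, 1:5)   # 120
```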

This is one language I believe every software engineer should go for before they pass on to the …



By default, Statistics is not “intelligent”

January 21, 2019

Now please do not get upset. I am using the word “intelligent” the way computer scientists used it some six decades ago. According to the late John McCarthy, the man who coined the term “artificial intelligence”, something is intelligent if the ‘thing’ has common sense.

We shall therefore say that a program has common sense if it automatically deduces for itself a sufficiently wide class of immediate consequences of anything it is told and what it already knows — John McCarthy.

The fact that in statistics we have to get a sample of size n, and it has to be as large as we can afford with our resources, makes Statistics not so intelligent, if we follow McCarthy’s definition. The Central Limit Theorem (CLT) is the saving grace of Statistics, at least the classical kind. In classical statistics, before we proceed, we would ideally need to know the probability distribution f(X) that governs the values of the random variable X under study. This is usually unknown, but the CLT says: don’t lose hope, forget X. Just take lots and lots of samples of X and form the sample mean \bar{X}, with n > 30, because we know that the distribution of \bar{X} is going to be approximately Normal. However, when someone has common sense, you do not have to give him tons and tons of data before he knows what you are saying, so by computer science standards, Statistics is not intelligent 😉
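A quick simulation of that claim in R. I picked a deliberately skewed X (Exponential with rate 1) for illustration; the sample means of n = 30 draws still cluster Normally around E[X] = 1:

```r
# CLT sketch: X ~ Exponential(1) is skewed, yet the distribution of the
# sample mean over many samples of size n = 30 is close to Normal.
set.seed(42)
n    <- 30
reps <- 10000
xbar <- replicate(reps, mean(rexp(n, rate = 1)))

mean(xbar)  # close to E[X] = 1
sd(xbar)    # close to 1 / sqrt(30), about 0.18
```

A `hist(xbar)` at the end shows the familiar bell shape even though a histogram of raw `rexp` draws is strongly right-skewed.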

CBML is the way to go

August 23, 2018

I think Cloud-Based Machine Learning (CBML) is the way of the future. The reason I say this is that it combines the need for big data with machine learning processing. You have it all on one platform. The more I experiment with a CBML, the more I appreciate the convenience it affords. The nice thing, too, is that there is a graphical workflow one can use to conceptualize the steps in an experiment. Very nice.

Making AI Great Again

July 4, 2018

This is my own version of MAGA. To appear at the IJCCI 2018, Seville, Spain.

The Humanities of Maths/Computer Science

May 8, 2018

Last month I took up the exercise of tracking the history of the term AI. In the process I found myself meandering through some of the history of the maths that governs AI’s present state of the art.

Most scientists believe that mathematics is part of the Sciences. I do not hold such a view; rather, I consider it really a part of Philosophy, and so still part of the Humanities. In going through the history of the mathematical structures present in computer science, I am struck by the thought of how mathematics is a deeply human enterprise. I know this is obvious to some, and though this truth was somewhat present in a small way at the back of my mind, it is only now that I am dealing with it personally. This personal exercise brought the truth home to me in a profound way, and I am delighted in that discovery.

I am amazed at how the use of history can inform how one does his or her maths. In my examination, history can tell us what a mathematician or computer scientist went through. We can learn from their trials and errors, and from their tenacity in “keeping the faith”. An example I can recommend as worth studying is the struggle Geoffrey Hinton went through on the way to becoming the “godfather of deep learning”. It was not only a story about the maths of neural networks but also a story of how one’s passion and faith in an idea is benefitting society today.