Michael Williams

This conversation is closed.

A Universal Programming Language, or a language that can be universal.

When we thought of computers a decade ago, we thought of The Matrix and the universally recognized binary visuals of 101011011010110. But as time progressed and more advancements took place, we have gone from binary to many other forms such as VB, HTML, Java, C, API, and many other custom software languages; but really they are all saying the same things, just in different ways or words.

Why not have a language that defines the terms from all languages and creates a universal functionality, allowing different pieces of software to mesh more easily into large, diverse programs?

Is this already happening, in the works or just a dream of the future?

Ideas, thoughts, comments?

  • Jun 4 2013: My interpretation of the question is: when is the proliferation of computer languages and hand-built solutions going to solidify into a standard toolkit with a building code (yes, like building codes for constructing buildings)?
    Not for a long time. By a long time I mean this time next century or so.
    Look at the timeline for computer languages - the pace of their creation is not slowing down. If anything, it is speeding up.
    There is still fundamental research to be done in hardware and software development, with new wild cards being added all the time (quantum computing, for example).
    In my 37 years as a computer programmer/engineer/manager/consultant/designer/architect I have personally learned and programmed in 28 different compiled languages, and yet we (the industry) continue to re-invent ourselves every 18 months or so.
    That is the primary reason there are so few little old programmers around. How many times can you re-learn your knowledge base if it recycles every 18 months? No other industry lacks a solid base of knowledge that you can maintain and build on. After a few turns, people burn out and go elsewhere.
    Kids remain fascinated by the pace and so it remains a youth oriented culture without the benefit of journeyman's experience in solving problems - because no one stays around long enough to become a journeyman.
    It's not happening, and it won't happen for a long time (perhaps in pockets like gaming, but not generally) until the industry grows up.
    • thumb
      Jun 13 2013: So then how about we merge the creation of a universal programming language with an open-source AI algorithm that allows for constant programming-language input (the way an antivirus does for virus definitions)? Inside this software you could type as if you were talking to me right now, and have it translated into any programming language, or an all-in-one.

      Like a programming-language Babel fish, just geared toward converting common written/typed language into programming code, allowing use by anyone, and on any platform, since it could be read in any language.

      I understand that programming languages change for various reasons, but so does every human language, and we adapt to that rather well.
  • Jun 1 2013: I am an old computer programmer in the sense that I started to write programs on the first generation of IBM computers. Back then the programming languages were the so-called high-level languages, such as Fortran and Basic. These languages read more like spoken languages. However, in order for the computer to run a program, it had to be "compiled" into binary code. I agree with George that it was impossible to convert the binary code back into Fortran or Basic, because I think the RAM addresses used by the binary may depend on a particular data set. Personally I believe that a language like C could be used as a reference set, but I am not sure, because as the word size changes from 16 to 32 to 64 bits, etc., the relationship of the programming language to the binary code must change too.
    I hope what I said still makes sense to you guys, because I am only an ancient programmer in this.
  • Jun 1 2013: Another thing that might fit your proposal: it is often the case that several different languages are translated into the same intermediate-level language. For instance, several languages compile to Java bytecode, a language that can be understood by the virtual machine executing it.

    So maybe this comes close to your idea, but just doesn't cover all languages?
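    The shared-intermediate-language idea can be sketched in miniature. Below, two toy front-ends (an infix one and a postfix one — both invented for this sketch, and not real compilers or real JVM bytecode) translate different surface syntaxes into one shared stack bytecode that a single tiny virtual machine runs:

```python
def compile_infix(src):
    """Compile e.g. "2 + 3" into the shared stack bytecode."""
    a, op, b = src.split()
    ops = {"+": "ADD", "*": "MUL"}
    return [("PUSH", int(a)), ("PUSH", int(b)), (ops[op], None)]

def compile_postfix(src):
    """Compile e.g. "2 3 +" into the very same bytecode."""
    ops = {"+": "ADD", "*": "MUL"}
    code = []
    for tok in src.split():
        if tok in ops:
            code.append((ops[tok], None))
        else:
            code.append(("PUSH", int(tok)))
    return code

def run(bytecode):
    """One 'virtual machine' executes programs from both languages."""
    stack = []
    for op, arg in bytecode:
        if op == "PUSH":
            stack.append(arg)
        elif op == "ADD":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == "MUL":
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
    return stack.pop()

# Two different source languages, one intermediate form, one machine:
assert compile_infix("2 + 3") == compile_postfix("2 3 +")
print(run(compile_infix("2 + 3")))  # 5
```

    Real systems work the same way at larger scale: Java, Scala, Kotlin, and Clojure all compile to JVM bytecode, and the virtual machine neither knows nor cares which front-end produced it.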
  • Jun 1 2013: Maybe there is a universal programming language. A few ideas come to mind:

    1. One could think of boolean logic as the underlying language of all computers.

    2. Turing machines are a model for universal computation. No computer, no matter what kind of programming language you use, including future ones, can do anything that a Turing machine cannot do.

    Turing machines can even be implemented physically. However, if you try to implement your next Android app on a Turing machine, you will soon discover that other formalisms are handier for expressing actual programs.

    To answer your question: every programming language (except very focused, specialized ones) is said to be Turing-complete, which means they can all do what a Turing machine can. No more, no less.
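    For concreteness, here is what "a model for universal computation" looks like in practice: a Turing machine is just a state table driving a read/write head over a tape. The simulator and the binary-increment rule table below are a made-up minimal sketch, not from any particular textbook:

```python
def run_tm(tape, rules, state="start", blank="_", steps=10_000):
    """Run a Turing machine: rules maps (state, symbol) -> (state, write, move)."""
    tape = dict(enumerate(tape))  # sparse tape, indexed by cell position
    pos = 0
    for _ in range(steps):
        if state == "halt":
            break
        sym = tape.get(pos, blank)
        state, write, move = rules[(state, sym)]
        tape[pos] = write
        pos += 1 if move == "R" else -1
    cells = [tape[i] for i in sorted(tape)]
    return "".join(cells).strip(blank)

# Rules for incrementing a binary number: scan right, then carry leftward.
rules = {
    ("start", "0"): ("start", "0", "R"),
    ("start", "1"): ("start", "1", "R"),
    ("start", "_"): ("carry", "_", "L"),
    ("carry", "1"): ("carry", "0", "L"),
    ("carry", "0"): ("halt", "1", "L"),
    ("carry", "_"): ("halt", "1", "L"),
}

print(run_tm("1011", rules))  # 1011 + 1 = 1100
```

    Writing even binary increment this way illustrates the point above: the model is universal, but hopelessly inconvenient for everyday programming.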

    This leads to the insight that programming languages are more like tools to express algorithms. Which language is most suitable depends on the individual application. You might find that your hardware drivers should be developed in, say, C, while your web server uses Java (or whatever).

    This leads us to another point: There is often a hierarchy of languages. Higher level languages let you express your problem easily, the code being translated to a lower level language. And so on.

    More practically, I do not expect a unification of programming languages into a common one. Application domains are too different (one user wants to hack together a quick mobile app; others want to develop the software used for the next space-mission control).
  • May 31 2013: Hey Michael,
    you mean, a sort of Esperanto for the IT world? Perhaps a universally understood programming language is as difficult to standardize as a constructed universal language for communication...?
    • thumb
      May 31 2013: Sorta. I feel we could use what we already know, the basics of programming, as a foundation and framing for the construction of this universal language. It's always difficult to standardize something created after the need for it has already arisen; using the core concepts of programming as a foundation could allow for easier application and construction.
      • Jun 1 2013: It seems to me, although I am a layperson, that Java and HTML, for example, have a lot of similarities as it is. Sounds like an achievable plan, Michael!
  • May 31 2013: Binary is the first universal language. It is language simplified to its most fundamental state. I'm afraid this is the language that processors will always use, and that programming languages will always be compromised by the demands of machine language, involving conventions that try to meet us halfway. If Barry is right, I hope they hurry up before I die, because coding has always been a major hurdle for me.
    • thumb
      May 31 2013: I want to use binary as a basic reference point, like a sort of dictionary, so this tech could analyze and translate a language of one form into this universal language, allowing other concepts to take place.
  • May 31 2013: In the not too distant future we will be "programming" computers with natural language.
    • thumb
      May 31 2013: Exactly the same line of thought I am speaking of, just better worded. I feel that if we can overcome this many-languages (in code) issue, we can create software that uses this tech and lets users code from speech.
    • Jun 1 2013: I don't mean to disappoint you, but I remember hearing that in the early 1990s... I've been around for more than 20 years and the "not too distant future" still hasn't come. It is not impossible, just extremely difficult, so I don't think a thing like that can be available to the general public in less than 15 years.
      • Jun 1 2013: I think your estimate is plausible. I was involved with computer systems for over 30 years, and the most useless question I was ever asked was, "When will it be done?"
        • Jun 3 2013: Well... if "When will it be done?" is a useless question, then don't try to answer it; just keep telling the world "in the not too distant future" for another... what? 20 more years? Maybe in the meanwhile the holy spirit will come down to earth and build it for you while you keep on dreaming, arms crossed... No offence, but the way you talk makes it obvious that you have never worked on a similar or related project, nor even tried to figure out a plausible algorithm; otherwise you would be a little more humble.

          If "When will it be done?" weren't a useful question, architects would never start their buildings, engineers would never start to make anything, and projects of all kinds would simply be abandoned before starting. So if you mean to finish something, "When will it be done?" is a fundamental question, because it generates commitment and creates a sense of value.

          A scientist (or, for that matter, any professional) who answers "that is a useless question" when asked "When will it be done?" is indeed dodging his/her responsibility. He/she means: "I'm not really committed to making it work"... or even worse: "I'm too dumb to make it work."
      • Jun 3 2013: I think you misunderstood my point.

        There are industries that can produce fairly reliable project completion dates, but in my experience software is not one of them.

        You made an estimate of 15 years. Just how useful is that estimate? Can I rely on it? Can I make further plans and investments, relying on being able to program computers with natural language in 2028? Should I commit to hiring technical programmers for the next 15 years, assuming that no one will provide that capability before 2028? I don't think so.

        In my experience, no one can predict the future reliably, and trying to do so with any specificity is useless.
        • Jun 3 2013: Then I don't understand you previous comment otherwise than believing that you didn't read carefully.

          First: I think 20 years (and counting) is far beyond any "fairly reliable date", and certainly does not qualify as "not too distant future" (at least for me).

          Second: I did not say "it will be ready 15 years from today, May 31, 2013"... What I said and meant (please read carefully) was "it WON'T be ready in less than 15 years," which clearly means any number of years between 16 and infinity... I really think that does not qualify even as an "approximate forecast."

          Third: Go ahead and ask anyone around you, "When I say 'not too distant future,' how many years am I talking about?" Tell me the answers you get, and let's see who here is trying to make a prediction of the future with specificity.
      • Jun 3 2013: I think we have reached the crux of the misunderstanding, time scale.

        When I said 'not too distant' I was thinking of my granddaughter, who is 15. She will probably be programming computers with natural language before she retires. I must admit that the way I think about time is probably not common.

        I was just using 15 years to illustrate what I meant by useful.
          • Jun 3 2013: Well, it seems you have a completely different vision of what "not too distant future" means; I would call that a "distant future." I cannot speak for others, but to me your time scale seems quite huge.
  • May 31 2013: That language has existed since the early 1970s; we call it "C". But to answer your question in more detail: first of all, The Matrix is just a fantastic Hollywood story which is far from exact in many senses. Secondly, binary code is dependent on the hardware, so it is not universal; even the byte order is interpreted differently on different hardware. Thirdly, not all languages "say" the same things in different "words": you simply cannot write a database server in HTML, because the necessary "words" simply do not exist and there is no possible way to create them. Fourthly, API is not a language but an arbitrary interface that can be used to connect two or more pieces of software.

    Now, why do I say C is the so-called universal language? First, it runs on almost any hardware. Second, you can build anything from a small utility program to a full-blown graphical operating system; as a matter of fact, most modern operating systems were written in C. Any possible piece of software can be built in C; whatever a computer can do can be programmed in C. No other language is as powerful as C. However, that power comes with drawbacks: C can work wonders in the hands of a professional, but it can be a pain in the aa... neck in the hands of a non-professional. Masterpieces can come from a mind with lots of imagination, but minds without imagination always produce frustration and disappointment. It is an extremely efficient language, but it is hard to code and even harder to debug, so no wonder it is not the language of choice for many.

    Finally, connecting two pieces of software has more to do with design than with the language used. You can connect two pieces of software written in two different languages if they are designed for that purpose, and yet be unable to connect two programs written in the same language if they are not designed accordingly. In conclusion: "meshing" is a problem of design which has little or nothing to do with language.
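    The hardware-dependence point is easy to demonstrate: the very same 32-bit integer is laid out with opposite byte orders on little-endian and big-endian machines, which is one reason raw binary is not portable. A quick sketch using Python's standard struct module:

```python
import struct

n = 0x01020304  # one 32-bit integer

little = struct.pack("<I", n)  # little-endian: least significant byte first
big    = struct.pack(">I", n)  # big-endian ("network order")

print(little.hex())  # 04030201
print(big.hex())     # 01020304
```

    A file of raw integers written with the first layout and read back assuming the second would silently produce garbage values.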
    • thumb
      May 31 2013: Thank you greatly for the effort you put into replying; it was a good read. I must, however, argue that binary and C are very different: I understood binary to be just off and on, whereas C is a specialized vocabulary that maps onto those offs and ons in various forms.

      What I am suggesting is a language that has all languages, previous and future (new), defined in its database, so it can convert software from one language into code that remains unchanged no matter what language it was written in. That unified commonality would allow for things such as programming with basic language (speak and code). It would be along the same lines as a technology that lets one person speak in one language to another person speaking a different one: the machine understands both languages and converts accordingly into the other's native tongue. A sort of Babel fish with AI.

      And The Matrix was of course fiction; it's just easier at times to use fiction to visualize reality. Life imitates art in many ways, and movies are no exception.
      • May 31 2013: OK, understood... Your idea is quite interesting. That being said, I have to tell you that it has a fundamental flaw.

        When you translate human languages, what you translate are concepts which are common to all humankind. Those concepts exist in all languages, which is why they can easily be translated from one language to any other. Translation of ideas is a little more difficult, since words are organized differently in different languages, but that is not the point here. The central point of human-language translation is that there is a 1-to-1 correlation among concepts: any concept you can choose in English has an equivalent in Spanish, German, Portuguese, Russian, or Chinese, and even if it is expressed with 2 or 3 words, the correlation at the concept level remains 1 to 1.

        If we think of programming languages, most of them are specialized for some task. SQL, for example, is designed to manage data in a database; JavaScript is designed to manipulate HTML documents in a web browser. So there are concepts that exist in SQL that do not exist in JavaScript, and concepts that exist in JavaScript that do not exist in SQL. How could you possibly translate between those languages when there is nothing to link them?... Right now I can't figure out how.

        Another important point: human-language translation is always reversible, but computer-language translation may not be. For example, if there were a way to translate SQL into C, just 1 line of SQL could yield thousands of lines of C code (I guess about 10,000), but there is no guarantee the process would be reversible. In human languages, the effort you put in to translate, say, from English to Spanish is the same as you invest to translate from Spanish to English; however, the effort to translate from SQL to C may be quite trivial, while translating from C to SQL might be quite a challenge, if not impossible.

        So once again: nice idea, but I think it's not feasible, for the reasons I just laid out.