Machine Learning is the new C++

Discussion in 'App Development' started by xandman, May 28, 2015.

  1. I come from a literature background and I know how to spell "originates" :)

    You're talking plain nonsense, I'm afraid. I'll ask you this question:

    Do you know of any adaptive control and estimation algorithm that will allow a machine to run, read, dress, cry, laugh, etc., within five to six years of implementation? That is what he meant by an infant's intelligence.

    The best AI algorithm in the next 100 years will still be inferior to the brain of a monkey. Digest that, and if needed drink a cold glass of water along with your cornucopian, techno-optimist, and singularitarian friends. We've had enough of this AI bs already.

    I also warn traders here to avoid any product that carries an AI, adaptive, machine learning, genetic, or neural network label, because they risk losing their money.
     
    Last edited: Jun 1, 2015
    #11     Jun 1, 2015
  2. IAS_LLC

    Literature backgrounds prevent typos and the digestion of the true meaning of a statement? Interesting.

    What did I say that was nonsense? I said AI isn't real learning... it's pattern recognition. I also said the worst AI algorithm is more USEFUL than the smartest infant. I never said an adaptive control algorithm possessed the ability to learn how to do arbitrary things... what they do is collect data on the way the system responds to a limited set of inputs, establish I/O correlations, and accomplish a defined task more efficiently than any human ever could... AI isn't real intelligence, it's pattern recognition and optimization, like I said... but it is more USEFUL than MOST humans, for many applications.
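    To make "collect I/O data, establish correlations, accomplish a defined task" concrete, here is a minimal Python sketch. The linear plant, its gains, and the setpoint are all made up for illustration; no real system or product is implied:

```python
import numpy as np

rng = np.random.default_rng(0)
true_gain = np.array([2.0, -0.5])   # hidden plant gains (illustrative)

# "Collect data on the way the system responds to a limited set of inputs."
inputs = rng.normal(size=(200, 2))
outputs = inputs @ true_gain + rng.normal(scale=0.1, size=200)

# "Establish I/O correlations": ordinary least squares, no understanding required.
estimated_gain, *_ = np.linalg.lstsq(inputs, outputs, rcond=None)

# "Accomplish a defined task": pick the smallest input that drives the
# output to a setpoint the *human* chose.
setpoint = 1.0
best_input = estimated_gain * setpoint / (estimated_gain @ estimated_gain)
print(estimated_gain)                # recovers something close to true_gain
print(best_input @ estimated_gain)   # hits the setpoint, by construction
```

    Note the task, the inputs to try, and the setpoint are all supplied by the programmer; the algorithm only fits and inverts the correlations.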
     
    Last edited: Jun 1, 2015
    #12     Jun 1, 2015
  3. How do you know that learning isn't pattern recognition?
     
    #13     Jun 1, 2015
  4. xandman

    Dangerous stuff, indeed. I think GP has potential for testing your general method across numerous possible optimizations.

    For example, suppose an n-day breakout system is, on average, more profitable than an MA crossover across various GA optimizations. Wouldn't you say that n-day breakout methods are better than MA crossovers for that data, on average?

    Flawed logic? A tyranny-of-averages issue? Perhaps I could have made life easier with Monte Carlo?
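    A minimal sketch of that comparison, assuming synthetic prices and a plain parameter grid standing in for the GA runs; both strategy functions are toy next-bar backtests, not anyone's actual method:

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic, mildly trending price series (illustrative data only).
prices = 100 + np.cumsum(rng.normal(0.05, 1.0, size=2000))

def breakout_pnl(prices, n):
    """Long the next bar whenever price exceeds the prior n-day high."""
    pnl = 0.0
    for t in range(n, len(prices) - 1):
        if prices[t] > prices[t - n:t].max():
            pnl += prices[t + 1] - prices[t]
    return pnl

def ma_cross_pnl(prices, fast, slow):
    """Long the next bar whenever the fast MA is above the slow MA."""
    pnl = 0.0
    for t in range(slow, len(prices) - 1):
        if prices[t - fast:t].mean() > prices[t - slow:t].mean():
            pnl += prices[t + 1] - prices[t]
    return pnl

# Average each family over its own parameter grid (a stand-in for GA runs).
breakout_avg = np.mean([breakout_pnl(prices, n) for n in range(10, 60, 10)])
cross_avg = np.mean([ma_cross_pnl(prices, f, s)
                     for f in (5, 10, 20) for s in (50, 100) if f < s])
print(breakout_avg, cross_avg)
```

    Whichever family has the higher grid-average here says something about this series only; whether that generalizes is exactly the tyranny-of-averages question.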
     
    #14     Jun 1, 2015
    Quite a few people are trying, judging by the few emails I still get from desperate headhunters that don't drop automatically into my spam folder. Most are looking for "data scientists" with "big data, AI and machine learning experience".

    In fairness, these are not all finance jobs. Of the real finance jobs, probably half are greenfield start-ups jumping on a hot bandwagon with a few million of some idiot's VC money. The other half are large one-stop shops (what used to be called bancassurance) that can afford to piss away a few million on a series of vanity hires: the head of big data will join the heads of corporate social responsibility and imagination in the queue of people to get fired if HQ budgets ever get cut.

    I think "big data" is good for telling you what patterns exist in an existing data set, e.g. supermarket customer loyalty. I'm less convinced that it can be used to predict prices. In the rawest sense, if the relationships you are looking for only exist in "big data", then they are probably quite weak. I'm thinking of the sort of thing that econometrically has a high t-statistic but a very low R-squared; where you fit several dozen parameters just because you have several million observations and you can.
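    A quick illustrative simulation of that high-t-stat, low-R-squared situation, assuming a genuinely tiny linear effect buried in a few million synthetic observations:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 2_000_000                       # "several million observations"
x = rng.normal(size=n)
y = 0.01 * x + rng.normal(size=n)   # real but economically tiny effect

# OLS slope, its t-statistic, and R-squared, computed by hand.
beta = (x @ y) / (x @ x)
resid = y - beta * x
se = np.sqrt(resid @ resid / (n - 1)) / np.sqrt(x @ x)
t_stat = beta / se
r2 = 1 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))
print(t_stat)   # large: the effect is statistically "significant"
print(r2)       # tiny: it explains almost none of the variance
```

    With enough observations almost any nonzero coefficient becomes "significant"; significance says nothing about whether the relationship is worth trading.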
     
    #15     Jun 1, 2015
  6. IAS_LLC

    Valid question. Pattern recognition is a component of learning. Recognition doesn't imply intelligence. Understanding does.
     
    #16     Jun 1, 2015
  7. xandman

    Good point with Big Data.

    Referencing Antti Ilmanen on the public's preference for lottery-like payoffs, the social web is probably one of the major applications of BD/ML for harvesting this premium.
     
    #17     Jun 1, 2015
  8. What is understanding?
     
    #18     Jun 1, 2015
  9. IAS_LLC

    It's the difference between AI and real intelligence (as I see it). The AI algorithms I've seen are basically trying to minimize some user-defined "cost". I haven't seen anything that is able to come up with its own cost function (that doesn't mean it doesn't exist... I'd love to hear about it if it does). I define understanding (in this context) as being able to formulate a problem statement and a metric to describe optimality. Someone who understands the problem very well must formulate it in a very specific way for AI to be useful.
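    A minimal sketch of the point, assuming a toy quadratic cost: the human writes the cost function, the machine only turns the crank on it.

```python
# A user-defined "cost": the programmer, not the algorithm, decides
# what counts as good. Here the human has decided x = 3 is ideal.
def cost(x):
    return (x - 3.0) ** 2 + 1.0

def grad(x):
    return 2.0 * (x - 3.0)

# Plain gradient descent: mechanical minimization of the given cost.
x = 0.0
for _ in range(200):
    x -= 0.1 * grad(x)
print(x)   # converges near 3.0, the minimum the human specified
```

    Nothing in the loop can question whether x = 3 was the right target; formulating the cost is the "understanding" part, and it stays with the programmer.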

    I guess what I'm trying to say is that AI is like a Marine's rifle: it is only as good as the soldier (programmer) behind the scope (keyboard).
     
    #19     Jun 2, 2015
  10. Humpy

    Machine learning is improving, but I think the major advances in intelligence may come from the bio fields, when they make humans much smarter. Maybe some animals, like dogs, too.

    Hey Rover go fetch the newspaper and pay the man.
     
    #20     Jun 2, 2015