Machine learning: "Machine learning" means that a system improves on a particular task, according to a quantifiable measure, as a function of time. Carnegie Mellon's Tom Mitchell, a professor of Computer Science and an expert in the subfield of machine learning, defines it as follows:
A computer program is said to learn from experience E with respect to some class of tasks T and performance measure P, if its performance at tasks in T, as measured by P, improves with experience E.

For a computer program, to "learn from experience E" is to learn from data: computer programs can't "experience" human language use directly, for instance, but written or spoken communication can be digitized and datafied for them, as we've seen. Machine learning, because it does involve improvement in performance over time, is obviously relevant to the broader vision of AI as creating machines with human-like (or greater than human-like) intelligence.
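To make the E/T/P framing concrete, here is a minimal sketch in Python (it uses the scikit-learn library, which is not part of this text, and the numbers are purely illustrative). The task T is recognizing handwritten digits, the performance measure P is accuracy on a held-out test set, and the experience E is a growing number of labelled training examples:

    # Mitchell's framing: task T = digit classification,
    # performance P = test-set accuracy, experience E = labelled examples.
    from sklearn.datasets import load_digits
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import accuracy_score
    from sklearn.model_selection import train_test_split

    digits = load_digits()
    X_train, X_test, y_train, y_test = train_test_split(
        digits.data, digits.target, test_size=0.3, random_state=0
    )

    # Train on progressively larger slices of the data: more experience E.
    for n in (50, 200, 800, len(X_train)):
        model = LogisticRegression(max_iter=5000)
        model.fit(X_train[:n], y_train[:n])
        acc = accuracy_score(y_test, model.predict(X_test))
        print(f"experience E = {n:4d} examples -> performance P = {acc:.2f}")

The exact accuracies depend on the data and the model, but the overall pattern, better performance as experience grows, is what the definition above is pointing at.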
For example, IBM's Deep Blue lost its first match against Garry Kasparov in 1996 but won the 1997 rematch, an outcome often cited to show how a system's performance can improve from one encounter to the next as it draws on its opponent's moves, his weaknesses, and the outcomes of the previous games against him.
Artificial intelligence: The Oxford English Dictionary defines artificial intelligence as an area of study concerned with making computers copy intelligent human behaviour.
Cognitive computing: The term "cognitive computing" has been used to refer to new hardware and/or software that mimics the functioning of the human brain and helps to improve human decision-making.