Ashwani Kumar Shamlodhiya
AI Evolution

AI Is Not New
• The term "artificial intelligence" was formally coined in 1956 at the Dartmouth Conference. We have been building AI for nearly 70 years.
• Early pioneers like Alan Turing laid the groundwork (Turing Test).
The test involves a human judge who interacts with both a machine and another human, without knowing which is which. Communication occurs through text, allowing the judge to ask questions and receive responses from each participant.
The test’s goal is simple: if the judge cannot consistently identify which respondent is the machine, the machine is considered to have “passed” the Turing Test.
• 1950s-1970s: Early AI programs, such as chess-playing programs, emerged

1️. Rule-Based Systems and Symbolic Reasoning
(1950s–1980s)
Rule-based: "If X, then Y. Else, do Z."
These were expert systems. Think of them as giant flowcharts built by humans with hand-coded rules.
Example:
• E.g.1: In the healthcare sector, a rule-based system helps diagnose diseases by checking whether symptoms match a predefined list.
If (fever AND blood_marker_number > 40 AND rash_size > 20mm): then skin_cancer
else: no skin_cancer

We can represent knowledge in the form of if–then rules:

  1. Rule 1: IF patient has fever AND cough THEN patient might have flu.
  2. Rule 2: IF patient has fever AND body ache AND sore throat THEN patient might have flu.
  3. Rule 3: IF patient has fever AND cough AND loss of smell THEN patient might have COVID-19.

How Symbolic Reasoning Works
• Input (facts): Patient has fever and Patient has cough.
Rule 1 matches → conclusion: "Patient might have flu."

• E.g.2: In the financial sector, a rule-based system might flag a transaction as suspicious if it exceeds a certain amount and occurs in an unusual location or at an odd time.
If (transaction > 30000 AND transaction > max_amt_spent_till_today AND transaction_time = 1AM AND location = DarkAlley): then suspicious
else: not suspicious
These systems apply predefined rules to transaction data to identify anomalies.
• No learning. No adaptation. Just rules.
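The flu/COVID rules above can be sketched as a tiny forward-chaining rule engine. This is a minimal illustration, not any real expert system; the rule set and function names are made up to mirror the bullets above:

```python
# A minimal rule engine in the style of a symbolic expert system:
# hand-coded IF-THEN rules are matched against a set of known facts.
RULES = [
    ({"fever", "cough"}, "patient might have flu"),                        # Rule 1
    ({"fever", "body ache", "sore throat"}, "patient might have flu"),     # Rule 2
    ({"fever", "cough", "loss of smell"}, "patient might have COVID-19"),  # Rule 3
]

def diagnose(facts):
    """Return the conclusion of every rule whose conditions are all present."""
    return [conclusion for conditions, conclusion in RULES
            if conditions <= facts]  # subset test: all conditions satisfied

print(diagnose({"fever", "cough"}))  # Rule 1 fires -> ['patient might have flu']
```

Note that the system only ever re-applies the rules a human wrote; showing it a thousand new patients would not change its behavior.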

2️. Machine Learning
(1990s–2010s)
“Don’t tell me the rules—show me the data and I’ll figure it out.”
Algorithms began learning from patterns in data. This was also known as statistical learning.
Key Milestones:
• 1990s-2000s: The development of more advanced ML algorithms, including support vector machines and decision trees.

Example:
• Spam filters that learn to block new types of spam based on examples.
• Recommendation systems (like Amazon and Netflix)
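As a sketch of the "show me the data" idea, here is a toy spam filter built with naive Bayes in pure Python. The training examples and helper names are invented for illustration; real filters train on far larger datasets with more robust features:

```python
# A toy spam filter that learns word likelihoods from labelled examples
# instead of relying on hand-written rules.
import math
from collections import Counter

def train(messages):
    """messages: list of (text, label) pairs; returns per-class word and message counts."""
    counts = {"spam": Counter(), "ham": Counter()}
    totals = Counter()
    for text, label in messages:
        counts[label].update(text.lower().split())
        totals[label] += 1
    return counts, totals

def classify(text, counts, totals):
    """Pick the label with the higher log-probability, with add-one smoothing."""
    vocab = set(counts["spam"]) | set(counts["ham"])
    best, best_score = None, -math.inf
    for label in ("spam", "ham"):
        n = sum(counts[label].values())
        score = math.log(totals[label] / sum(totals.values()))  # class prior
        for word in text.lower().split():
            score += math.log((counts[label][word] + 1) / (n + len(vocab)))
        if score > best_score:
            best, best_score = label, score
    return best

examples = [
    ("win a free prize now", "spam"),
    ("claim your free money", "spam"),
    ("lunch meeting tomorrow", "ham"),
    ("project status update", "ham"),
]
counts, totals = train(examples)
print(classify("free prize money", counts, totals))  # -> spam
```

Nobody wrote a rule saying "free" is spammy; the model inferred it from the examples, and retraining on new examples updates its behavior automatically.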

3️. Deep Learning
(2012–Now)
Deep neural networks started outperforming humans in image recognition, speech, and more. They are inspired by neurons in the brain and stack multiple layers of artificial neurons (hence "deep" learning).
Breakthroughs in convolutional neural networks (CNNs) and recurrent neural networks (RNNs) led to major advancements in image and speech recognition.

Example:
• Self-driving cars
• Facial recognition
• Real-time language translation
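To make "multiple layers" concrete, here is a minimal two-layer forward pass in plain Python. The weights are arbitrary toy values chosen for this sketch; in a real network they are learned from data by backpropagation:

```python
# A tiny two-layer neural network forward pass: each layer computes
# weighted sums of its inputs, adds a bias, and applies a nonlinearity.

def relu(x):
    """Rectified linear unit: the standard deep-learning nonlinearity."""
    return max(0.0, x)

def dense(inputs, weights, biases):
    """One fully connected layer: one weighted sum + bias per output neuron."""
    return [relu(sum(w * x for w, x in zip(row, inputs)) + b)
            for row, b in zip(weights, biases)]

# Layer 1: 2 inputs -> 3 hidden neurons; Layer 2: 3 hidden -> 1 output.
hidden = dense([1.0, 2.0],
               [[0.5, -0.2], [0.3, 0.8], [-0.7, 0.1]],
               [0.1, 0.0, 0.2])
output = dense(hidden,
               [[1.0, 0.5, 0.3]],
               [0.0])
print(output)  # a single activation, approximately [1.15]
```

Stacking many such layers (convolutional ones for images, recurrent ones for sequences) is what gave CNNs and RNNs their breakthrough results.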

4️. Generative AI
(2020s–Future)
“Now I can learn, imagine, write, and dream new things.”
These models generate brand new content: text, art, code, music—even deepfakes.
Example:
• ChatGPT writing stories
• DALL·E creating images from words
• MusicLM generating songs in custom styles
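Real generative models are deep networks trained on huge corpora, but the core idea (learn a distribution over data, then sample new sequences from it) can be sketched with a toy word-level Markov chain. This is a vastly simplified stand-in, not how LLMs actually work, though the "predict the next token" framing is the same:

```python
# A word-level Markov chain: learn which word tends to follow which,
# then sample a new sequence -- generation in miniature.
import random
from collections import defaultdict

def build_chain(text):
    """Map each word to the list of words observed to follow it."""
    chain = defaultdict(list)
    words = text.split()
    for current, nxt in zip(words, words[1:]):
        chain[current].append(nxt)
    return chain

def generate(chain, start, length=8, seed=0):
    random.seed(seed)  # fixed seed so this sketch is reproducible
    out = [start]
    for _ in range(length - 1):
        followers = chain.get(out[-1])
        if not followers:
            break  # dead end: no observed continuation
        out.append(random.choice(followers))
    return " ".join(out)

corpus = "the cat sat on the mat the cat ran to the door"
chain = build_chain(corpus)
print(generate(chain, "the"))  # a new sentence sampled from the learned chain
```

The output is a sequence that never appeared verbatim in the training text, which is exactly the generative trick, scaled down from billions of parameters to a dictionary of word lists.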
