Living and Giving

Ada Lovelace: The First Computer Programmer


I know you are thinking of Steve Wozniak or Steve Jobs. But guess what? There was a lady computer programmer more than 100 years earlier: Ada Lovelace, the daughter of Lord Byron, a superb poet!
Ada’s mentor was Charles Babbage, a mathematician, philosopher, and inventor. He and Ada Lovelace are credited with conceptualizing the first programmable computer. Babbage designed the Analytical Engine, an invention that would have worked as a functioning computer had it been completed.
Ada wrote the first algorithm for computing numbers on such a machine! But she thought it could be so much more.
The Analytical Engine was never finished, but more than 100 years later, pioneers like Woz and Jobs built working computers. Her ideas contributed to their inventions.
Celebrating early engineers and pioneering women!


Ada Lovelace was the only legitimate daughter of Lord Byron, one of England’s most famous poets. Her parents separated shortly after Ada’s birth, and Byron left England. He died in Greece a few years later. Although she never knew her father, Byron’s legacy greatly influenced Ada’s upbringing. Her mother was paranoid that she would inherit her poet father’s erratic temperament, and made sure that she was tutored in mathematics and science.

When Ada was 17, her mentor Charles Babbage showed her the prototype for his ‘Difference Engine,’ the world’s first computer. In 1842, Babbage asked Lovelace to help translate an article about the plans for his newest machine, the ‘Analytical Engine.’ She appended a lengthy set of notes to her translation, in which she wrote an algorithm that the engine could use to compute Bernoulli numbers.

While the extent of her original contribution is disputed, her code is now considered the world’s first computer program. Lovelace theorized that the machine might eventually do far more than calculate numbers. Babbage’s engine was never built and her code was never tested, but many of her insights about the future of computing proved to be true.
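For readers curious what that computation looks like today: Lovelace’s Note G laid out a stepwise table of operations for producing Bernoulli numbers on the Analytical Engine. As a minimal modern sketch (using the standard recurrence for Bernoulli numbers, not a reconstruction of her original operation table), the same values can be computed like this:

```python
from fractions import Fraction
from math import comb

def bernoulli(n):
    """Return the Bernoulli numbers B_0 .. B_n as exact fractions,
    using the recurrence B_m = -1/(m+1) * sum_{k<m} C(m+1, k) * B_k."""
    B = [Fraction(1)]  # B_0 = 1
    for m in range(1, n + 1):
        s = sum(Fraction(comb(m + 1, k)) * B[k] for k in range(m))
        B.append(-s / (m + 1))
    return B

print(bernoulli(4))  # B_0=1, B_1=-1/2, B_2=1/6, B_3=0, B_4=-1/30
```

Exact rational arithmetic (`Fraction`) is used here because Bernoulli numbers are fractions, and floating-point rounding would quickly obscure the small values.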