A computer course that finally focuses on understanding

What you will learn

A unique explanation of binary numbers

A refreshing new way to learn electronics

How to build logic gates and adders using transistors

Two's complement: how computers do subtraction

How computers do multiplication and division

What a transistor does

Description

Understanding how computers compute is the first step towards understanding how computers work. That understanding rests on three building blocks.

1. How computers represent numbers: Binary numbers

2. Who does the computing: Transistor


3. How to combine transistors to do computing: Building an adder

The first hurdle in learning how computers work is binary numbers. Many people have trouble seeing binary numbers as legitimate numbers, myself included, so this course aims to give you a thorough understanding of numbers from a unique perspective. Once you are comfortable with binary numbers, we will talk about the little device at the core of modern electronics: the transistor. We won't get into the quantum mechanics behind transistors, fundamental as they are to our modern way of life, but we will briefly cover what a transistor does. This paves the way for a solid understanding of how to use transistors to compute. Once we are comfortable with how numbers are represented and with what this little switching device can do, we will dive into the details of combining transistors into an adder that can add two numbers. We will also talk about how computers subtract using a trick called the two's complement method.
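
To give a taste of what "combining transistors into an adder" means, here is a minimal sketch in Python, not material from the course itself: the gates that the course builds from transistors are modeled as plain functions, and a half adder and full adder are wired out of them. The 4-bit ripple adder at the end is an illustrative assumption about how the pieces chain together.

def AND(a, b):
    return a & b

def OR(a, b):
    return a | b

def XOR(a, b):
    return a ^ b

def half_adder(a, b):
    # Add two bits: the XOR gives the sum bit, the AND gives the carry.
    return XOR(a, b), AND(a, b)

def full_adder(a, b, carry_in):
    # Add two bits plus an incoming carry: two half adders and an OR.
    s1, c1 = half_adder(a, b)
    s2, c2 = half_adder(s1, carry_in)
    return s2, OR(c1, c2)

def add_4bit(a_bits, b_bits):
    # Chain full adders, least significant bit first (a ripple-carry adder).
    result, carry = [], 0
    for a, b in zip(a_bits, b_bits):
        s, carry = full_adder(a, b, carry)
        result.append(s)
    return result, carry

# 5 (0101) + 3 (0011) = 8 (1000), bits listed least significant first:
print(add_4bit([1, 0, 1, 0], [1, 1, 0, 0]))  # ([0, 0, 0, 1], 0)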

This mini course will equip you with a thorough understanding of exactly how computers compute.
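
And as a preview of the subtraction trick, here is a minimal two's complement sketch in Python, again illustrative rather than course material; the 8-bit width is an arbitrary choice.

BITS = 8
MASK = (1 << BITS) - 1  # 0b11111111

def twos_complement(x):
    # Negate x in 8-bit two's complement: invert every bit, then add 1.
    return (~x + 1) & MASK

def subtract(a, b):
    # a - b computed with addition only, as the adder circuit would do it.
    return (a + twos_complement(b)) & MASK

print(format(twos_complement(5), "08b"))  # 11111011, the 8-bit code for -5
print(subtract(9, 5))                     # 4
print(subtract(3, 5))                     # 254, i.e. 0b11111110, the code for -2

The payoff is that no separate subtraction circuit is needed: encoding negative numbers this way lets the same adder do both jobs.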

Language: English

Content

Introduction

Some common confusions

Numbers

How do binary numbers work?

Computing

What is computing?

Who does the computing in computers?

Transistor

How do computers add?

Get some clues about adding two bits
Some electricity knowledge
Finishing the half adder
Make a full adder

How do computers subtract?

Two’s complement method: Theory
Two’s complement method: Implementation

Onward

Onward