English | MP4 | AVC 1280×720 | AAC 44KHz 2ch | 22h 32m | 8.39 GB
These Computer Science Video Lectures cover fundamental concepts that provide a first step in understanding the nature of computer science’s undeniable impact on the modern world. They cover basic elements of programming, algorithms and data structures, theory of computing and machine architecture, all in the context of applications in science, engineering, and commerce.
The basis for education in the last millennium was “reading, writing, and arithmetic”; now it is reading, writing, and computing. Learning to program is an essential part of the skill set of every working professional, not just programmers and engineers but also artists, scientists, and humanists. This collection of video lectures aims to teach programming to those who need or want to learn it, in a scientific context. But computer science is much more than just programming. These lectures also aim to teach fundamental concepts of the discipline, which can serve as a basis for further study of some of the most important scientific ideas of the past century.
What You Will Learn
- Basic elements, including variables, assignment statements, built-in types of data, conditionals and loops, arrays, and I/O, including graphics and sound
- Functions and modules, stressing the fundamental idea of dividing a program into components that can be independently debugged, maintained, and reused
- Object-oriented programming, centered on an introduction to data abstraction
- Applications, drawing examples from applied mathematics, the physical and biological sciences, and computer science itself
- Algorithms and data structures, emphasizing the use of the scientific method to understand performance characteristics of implementations
- Theory of computing, which helps us address basic questions about computation, using simple abstract models of computers
- Machine architecture, providing a link between the abstract machines of the theory of computing and the real computers that we use
- Historical context, including the fascinating story of the development and application of fundamental ideas about computation by Alan Turing, John von Neumann, and many others
PART I: PROGRAMMING IN JAVA
Lecture 0: Prologue—A Simple Machine. This lecture introduces fundamental ideas of computation in the context of a familiar and important application from the field of cryptography. The story motivates the study of computer science, but the concepts covered are a bit advanced, so novices may wish to review it again after watching the other lectures in the course.
Lecture 1: Basics. Why program? This lecture addresses that basic question. Then it describes the anatomy of your first program and the process of developing a program in Java using either virtual terminals or a program development environment, with some historical context. Most of the lecture is devoted to thorough coverage of Java’s built-in data types, with example programs for each.
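To give a flavor of the “anatomy of your first program” described above, here is a minimal sketch of a first Java program; the class name HelloWorld and the message are illustrative, not necessarily the lecture’s exact code:

```java
public class HelloWorld {
    public static void main(String[] args) {
        // The anatomy of a first Java program: a class, a main() method,
        // and a statement that prints a message to standard output.
        System.out.println("Hello, World");
    }
}
```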
Lecture 2: Conditionals and Loops. The if, while, and for statements are Java’s fundamental control structures. This lecture is built around short programs that use these constructs to address important computational tasks. Examples include sorting, computing the square root, factoring, and simulating a random process. The lecture concludes with a detailed example illustrating the process of debugging a program.
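As a taste of the short programs this lecture is built around, here is a sketch of computing the square root with Newton’s method; the class name Sqrt and the use of a command-line argument are illustrative assumptions:

```java
public class Sqrt {
    public static void main(String[] args) {
        double c = Double.parseDouble(args[0]);   // number whose square root we want
        double epsilon = 1e-15;                   // relative error tolerance
        double t = c;                             // initial estimate

        // Newton's method: repeatedly replace t with the average of t and c/t
        // until the estimate stops changing (to within the tolerance).
        while (Math.abs(t - c/t) > epsilon * t) {
            t = (t + c/t) / 2.0;
        }
        System.out.println(t);
    }
}
```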
Lecture 3: Arrays. Computing with a large sequence of values of the same type is extremely common. This lecture describes Java’s built-in array data structure that supports such applications, with several examples, including shuffling a deck of cards, the coupon collector test for randomness, and random walks in a grid.
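To illustrate the card-shuffling example, here is a sketch of the Knuth (Fisher–Yates) shuffle over an array of strings; the deck encoding below is an assumption for illustration:

```java
public class Shuffle {
    public static void main(String[] args) {
        // Build a 52-card deck from ranks and suits.
        String[] ranks = { "2", "3", "4", "5", "6", "7", "8", "9", "10", "J", "Q", "K", "A" };
        String[] suits = { "C", "D", "H", "S" };
        String[] deck = new String[ranks.length * suits.length];
        for (int i = 0; i < ranks.length; i++)
            for (int j = 0; j < suits.length; j++)
                deck[suits.length*i + j] = ranks[i] + suits[j];

        // Knuth shuffle: swap each card with a uniformly random card
        // chosen from the not-yet-shuffled portion of the array.
        int n = deck.length;
        for (int i = 0; i < n; i++) {
            int r = i + (int) (Math.random() * (n - i));
            String temp = deck[r];
            deck[r] = deck[i];
            deck[i] = temp;
        }

        for (String card : deck)
            System.out.println(card);
    }
}
```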
Lecture 4: Input and Output. To interact with our programs, we need mechanisms for taking information from the outside world and for presenting information to the outside world. This lecture describes several such mechanisms for text, drawings, and animation. Detailed examples covered include fractal drawings that model natural phenomena and an animation of a ball bouncing around in the display window.
Lecture 5: Functions and Libraries. Modular programming is the art and science of breaking a program into pieces that can be individually developed. This lecture introduces functions (Java methods), a fundamental mechanism that enables modular programming. Motivating examples include functions for the classic Gaussian distribution and an application that creates digital music.
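A sketch of what a small Gaussian library might look like, as a hypothetical example of the modular style this lecture introduces (the class and method names are illustrative):

```java
public class Gaussian {
    // Standard normal probability density function phi(x).
    public static double pdf(double x) {
        return Math.exp(-x*x / 2) / Math.sqrt(2 * Math.PI);
    }

    // Density for a normal distribution with mean mu and standard deviation sigma.
    public static double pdf(double x, double mu, double sigma) {
        return pdf((x - mu) / sigma) / sigma;
    }

    // Test client: print a few values of the standard normal density.
    public static void main(String[] args) {
        for (double x = -2.0; x <= 2.0; x += 1.0)
            System.out.println(x + " " + pdf(x));
    }
}
```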
Lecture 6: Recursion. A recursive function is one that calls itself. This lecture introduces the concept by treating in detail the ruler function and (related) classic examples, including the Towers of Hanoi puzzle, the H-tree, and simple models of the real world based on recursion. We show a common pitfall in the use of recursion, and a simple way to avoid it, which introduces a different (related) programming paradigm known as dynamic programming.
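Here is a sketch of the classic Towers of Hanoi recursion mentioned above; this is one standard formulation, not necessarily the exact program shown in the lecture:

```java
public class TowersOfHanoi {
    // Print the moves needed to transfer n discs, moving disc n left or right.
    public static void moves(int n, boolean left) {
        if (n == 0) return;            // base case: no discs, no moves
        moves(n - 1, !left);           // recursively move the n-1 smaller discs
        if (left) System.out.println(n + " left");
        else      System.out.println(n + " right");
        moves(n - 1, !left);           // move the n-1 smaller discs back on top
    }

    public static void main(String[] args) {
        int n = Integer.parseInt(args[0]);
        moves(n, true);
    }
}
```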
Lecture 7: Performance. When you develop a program, you need to be aware of its resource requirements. In this lecture, we describe a scientific approach to understanding performance, where we develop mathematical models describing the running time of our programs and then run empirical tests to validate them. Eventually we come to a simple and effective approach that you can use to predict the running time of your own programs that involve significant amounts of computation.
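As an illustration of the empirical side of this approach, here is a sketch of a doubling experiment; the brute-force triple-counting computation is just a hypothetical stand-in for a program that involves a significant amount of computation:

```java
public class DoublingTest {
    // A stand-in computation: count triples that sum to 0 (a brute-force ~n^3 algorithm).
    public static int countTriples(double[] a) {
        int n = a.length, count = 0;
        for (int i = 0; i < n; i++)
            for (int j = i + 1; j < n; j++)
                for (int k = j + 1; k < n; k++)
                    if (a[i] + a[j] + a[k] == 0.0) count++;
        return count;
    }

    public static void main(String[] args) {
        // Double the problem size and record the running time for each size;
        // the ratio of successive times estimates the order of growth.
        for (int n = 250; n <= 2000; n *= 2) {
            double[] a = new double[n];
            for (int i = 0; i < n; i++) a[i] = 2 * Math.random() - 1;
            long start = System.currentTimeMillis();
            countTriples(a);
            double seconds = (System.currentTimeMillis() - start) / 1000.0;
            System.out.println(n + " " + seconds);
        }
    }
}
```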
Lecture 8: Abstract Data Types. In Java, you can create your own data types and use them in your programs. In this and the next lecture, we show how this ability allows us to view our programs as abstract representations of real-world concepts. First we show the mechanics of writing client programs that use data types. Our examples involve abstractions such as color, images, and genes. This style of programming is known as object-oriented programming because our programs manipulate objects, which hold data type values.
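As a hypothetical example of a client that uses an existing data type, the following sketch computes the monochrome luminance of a java.awt.Color value, using the standard NTSC weights:

```java
import java.awt.Color;

public class Luminance {
    // Monochrome luminance of a color (standard NTSC weights).
    public static double lum(Color c) {
        return 0.299*c.getRed() + 0.587*c.getGreen() + 0.114*c.getBlue();
    }

    public static void main(String[] args) {
        Color red  = new Color(255, 0, 0);
        Color blue = new Color(0, 0, 255);
        System.out.println(lum(red));    // 76.245
        System.out.println(lum(blue));   // 29.07
    }
}
```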
Lecture 9: Creating Data Types. Creating your own data types is the central activity in modern Java programming. This lecture covers the mechanics (instance variables, constructors, instance methods, and test clients) and then develops several examples, culminating in a program that uses a quintessential mathematical abstraction (complex numbers) to create visual representations of the famous Mandelbrot set.
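A sketch of a complex-number data type in this style, showing instance variables, a constructor, instance methods, and a small test client that iterates z = z*z + c as in the Mandelbrot computation; this is one plausible formulation, not necessarily the lecture’s exact code:

```java
public class Complex {
    private final double re;   // real part
    private final double im;   // imaginary part

    public Complex(double re, double im) {
        this.re = re;
        this.im = im;
    }

    // Sum of this complex number and b.
    public Complex plus(Complex b) {
        return new Complex(re + b.re, im + b.im);
    }

    // Product of this complex number and b.
    public Complex times(Complex b) {
        return new Complex(re*b.re - im*b.im, re*b.im + im*b.re);
    }

    // Magnitude |z|.
    public double abs() {
        return Math.sqrt(re*re + im*im);
    }

    public String toString() {
        return re + " + " + im + "i";
    }

    // Test client: iterate z = z*z + c a few times, as in the Mandelbrot computation.
    public static void main(String[] args) {
        Complex c = new Complex(-0.5, 0.5);
        Complex z = new Complex(0.0, 0.0);
        for (int t = 0; t < 10; t++) {
            z = z.times(z).plus(c);
            System.out.println(t + ": " + z + "  |z| = " + z.abs());
        }
    }
}
```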
Lecture 10: Programming Languages. We conclude the first half of the course with an overview of important issues surrounding programming languages. To convince you that your knowledge of Java will enable you to learn other programming languages, we show implementations of a typical program in C, C++, Python, and MATLAB. We describe important differences among these languages and address fundamental issues, such as garbage collection, type checking, object-oriented programming, and functional programming, with some brief historical context.
PART II: ALGORITHMS, THEORY, AND MACHINES
Lecture 11: Searching and Sorting. Building on the scientific approach developed in Part I (Lecture 7), we introduce and study classic algorithms for two fundamental problems, in the context of realistic applications. Our message is that efficient algorithms (binary search and mergesort, in this case) are a key ingredient in addressing computational problems with scalable solutions that can handle huge instances.
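A sketch of binary search over a sorted array, to illustrate the kind of efficient algorithm this lecture studies (the array contents and class name are illustrative):

```java
public class BinarySearch {
    // Return the index of key in the sorted array a[], or -1 if it is not present.
    public static int search(String key, String[] a) {
        int lo = 0, hi = a.length - 1;
        while (lo <= hi) {
            int mid = lo + (hi - lo) / 2;           // halve the interval each step
            int cmp = key.compareTo(a[mid]);
            if      (cmp < 0) hi = mid - 1;         // key is in the left half
            else if (cmp > 0) lo = mid + 1;         // key is in the right half
            else              return mid;           // found it
        }
        return -1;
    }

    public static void main(String[] args) {
        String[] a = { "alice", "bob", "carol", "dave", "erin" };  // must be sorted
        System.out.println(search("carol", a));   // 2
        System.out.println(search("frank", a));   // -1
    }
}
```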
Lecture 12: Stacks and Queues. Our introduction to data structures is a careful look at the fundamental stack and queue abstractions, including performance specifications. Then we introduce the concept of linked structures and focus on their utility in developing simple, safe, clear, and efficient implementations of stacks and queues.
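A sketch of a linked-list implementation of a stack of strings, in the spirit of this lecture (names such as LinkedStackOfStrings are illustrative):

```java
public class LinkedStackOfStrings {
    private Node first = null;     // top of the stack

    // A linked-list node holds one item and a reference to the next node.
    private class Node {
        String item;
        Node next;
    }

    public boolean isEmpty() {
        return first == null;
    }

    // Insert a new item at the front of the list (the top of the stack).
    public void push(String item) {
        Node oldFirst = first;
        first = new Node();
        first.item = item;
        first.next = oldFirst;
    }

    // Remove and return the item at the front of the list.
    public String pop() {
        String item = first.item;
        first = first.next;
        return item;
    }

    public static void main(String[] args) {
        LinkedStackOfStrings stack = new LinkedStackOfStrings();
        stack.push("to");
        stack.push("be");
        stack.push("or");
        while (!stack.isEmpty())
            System.out.println(stack.pop());   // prints: or be to
    }
}
```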
Lecture 13: Symbol Tables. The symbol table abstraction is one of the most important and useful programmer’s tools, as we illustrate with several examples in this lecture. Extending the scientific approach of the previous two lectures, we introduce and study binary search trees, a classic data structure that supports efficient implementations of this abstraction.
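A sketch of a binary search tree symbol table with put() and get(), to suggest how the abstraction might be implemented; string keys and integer values are an illustrative choice:

```java
public class BST {
    private Node root;    // root of the binary search tree

    private class Node {
        String key;
        int value;
        Node left, right;
        Node(String key, int value) { this.key = key; this.value = value; }
    }

    // Associate value with key, replacing the old value if key is already present.
    public void put(String key, int value) {
        root = put(root, key, value);
    }

    private Node put(Node x, String key, int value) {
        if (x == null) return new Node(key, value);
        int cmp = key.compareTo(x.key);
        if      (cmp < 0) x.left  = put(x.left,  key, value);
        else if (cmp > 0) x.right = put(x.right, key, value);
        else              x.value = value;
        return x;
    }

    // Return the value associated with key, or null if key is absent.
    public Integer get(String key) {
        Node x = root;
        while (x != null) {
            int cmp = key.compareTo(x.key);
            if      (cmp < 0) x = x.left;
            else if (cmp > 0) x = x.right;
            else              return x.value;
        }
        return null;
    }

    public static void main(String[] args) {
        BST st = new BST();
        st.put("it", 1); st.put("was", 2); st.put("the", 3);
        System.out.println(st.get("was"));   // 2
        System.out.println(st.get("best"));  // null
    }
}
```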
Lecture 14: Introduction to Theory of Computation. The theory of computation helps us address fundamental questions about the nature of computation while at the same time helping us better understand the ways in which we interact with the computer. In this lecture, we introduce formal languages and abstract machines, focusing on simple models that are actually widely useful in practical applications.
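As a small concrete illustration of an abstract machine, here is a sketch that simulates a hypothetical deterministic finite automaton (DFA) with a state-transition table; the particular language it recognizes is chosen arbitrarily:

```java
public class DFA {
    public static void main(String[] args) {
        // A two-state DFA over the alphabet {0,1} that accepts strings
        // containing an odd number of 1s. next[state][symbol] is the new state.
        int[][] next = { { 0, 1 }, { 1, 0 } };
        int accept = 1;

        // Read the input string (assumed to contain only 0s and 1s).
        String input = args.length > 0 ? args[0] : "10110";
        int state = 0;
        for (int i = 0; i < input.length(); i++) {
            int symbol = input.charAt(i) - '0';
            state = next[state][symbol];
        }
        System.out.println(state == accept ? "accept" : "reject");
    }
}
```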
Lecture 15: Turing Machines. In 1936, Alan Turing published a paper that is widely hailed as one of the most important scientific papers of the 20th century. This lecture is devoted to the two far-reaching central ideas of the paper: All computational devices have equivalent computational power, and there are limitations to that power.
Lecture 16: Intractability. As computer applications expanded, computer scientists and mathematicians realized that a refinement of Turing’s ideas was needed. Which computational problems can we solve with the resource limitations that are inescapable in the real world? As described in this lecture, this question, fundamentally, remains unanswered.
Lecture 17: A Computing Machine. Every programmer needs to understand the basic characteristics of the underlying computer processor being used. Fortunately, the fundamental design of computer processors has changed little since the 1960s. In this lecture, we provide insights into how your Java code actually gets its job done by introducing an imaginary computer that is similar to both the minicomputers of the 1960s and the microprocessor chips found in today’s laptops and mobile devices.
Lecture 18: von Neumann Machines. Continuing our description of processor design and low-level programming, we provide context stretching back to the 1950s and discuss future implications of the von Neumann machine, where programs and data are kept in the same memory. We examine in detail the idea that we design new computers by simulating them on old ones, something that Turing’s theory guarantees will always be effective.
Lecture 19: Combinational Circuits. Starting with a few simple abstractions (wires that can carry on/off values and switches that can control the values carried by wires), we address in this lecture the design of the circuits that implement computer processors. We consider gates that implement simple logical functions and components for higher-level functions, such as addition. The lecture culminates with a full circuit for an arithmetic/logic unit.
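To connect the gate-level description to something executable, here is a sketch of a one-bit full adder expressed with boolean operations standing in for logic gates (an illustration, not a circuit diagram):

```java
public class FullAdder {
    // One-bit full adder: sum is the XOR of the three inputs,
    // carry-out is the majority function of the three inputs.
    public static boolean[] add(boolean a, boolean b, boolean carryIn) {
        boolean sum      = a ^ b ^ carryIn;
        boolean carryOut = (a & b) | (a & carryIn) | (b & carryIn);
        return new boolean[] { sum, carryOut };
    }

    public static void main(String[] args) {
        // Print the full truth table.
        boolean[] bits = { false, true };
        for (boolean a : bits)
            for (boolean b : bits)
                for (boolean c : bits) {
                    boolean[] r = add(a, b, c);
                    System.out.println(a + " " + b + " " + c
                        + " -> sum=" + r[0] + " carry=" + r[1]);
                }
    }
}
```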
Lecture 20: CPU. In this lecture we provide the last part of our answer to the question “How does a computer work?” by developing a complete circuit for a computer processor, where every switch and wire is visible. While vastly different in scale, this circuit, from a design point of view, has many of the same characteristics as the circuits found in modern computational devices.
Table of Contents
01 Introduction to Part I
02 Brief introduction
03 Secure communication with a one-time pad
04 Linear feedback shift registers
05 Implications
06 Why programming
07 Program development
08 Built-in data types
09 Type conversion
10 Conditionals – the if statement
11 Loops – the while statement
12 An alternative – the for loop
13 Nesting
14 Debugging
15 Basic concepts
16 Examples of array-processing code
17 Two-dimensional arrays
18 Standard input and output
19 Standard drawing
20 Fractal drawings
21 Animation
22 Basic concepts
23 Case study – Digital audio
24 Application – Gaussian distribution
25 Modular programming and libraries
26 Foundations
27 A classic example
28 Recursive graphics
29 Avoiding exponential waste
30 Dynamic programming
31 The challenge
32 Empirical analysis
33 Mathematical models
34 Doubling method
35 Familiar examples
36 Overview
37 Color
38 Image processing
39 String processing
40 Overview
41 Point charges
42 Turtle graphics
43 Complex numbers
44 Popular languages
45 Java in context
46 Object-oriented programming
47 Type checking
48 Functional programming
49 Introduction to Part II
50 A typical client
51 Binary search
52 Insertion sort
53 Mergesort
54 Longest repeated substring
55 APIs
56 Clients
57 Strawman implementation
58 Linked lists
59 Implementations
60 APIs and clients
61 A design challenge
62 Binary search trees
63 Implementation
64 Analysis
65 Overview
66 Regular expressions
67 DFAs
68 Applications
69 Limitations
70 Context
71 A simple model of computation
72 Universality
73 Computability
74 Implications
75 Reasonable questions
76 P and NP
77 Poly-time reductions
78 NP-completeness
79 Living with intractability
80 Overview
81 Data types
82 Instructions
83 Operating the machine
84 Machine language programming
85 Perspective
86 A note of caution
87 Practical implications
88 Simulation
89 Building blocks
90 Boolean algebra
91 Digital circuits
92 Adder circuit
93 Arithmetic/logic unit
94 Overview
95 Bits, registers, and memory
96 Program counter
97 Components, connections, and control