Students are expected to learn about a wide variety of topics, from big O notation and calculus, through networking protocols and layers, to computer architecture.
In computer science, big O notation is used to classify algorithms according to how their running time or working space requirements grow as the input size grows.
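The idea can be made concrete by counting the basic operations two algorithms perform on the same input. The sketch below (a hypothetical illustration, not taken from the text) compares a linear search, whose step count is O(n), with a binary search over sorted data, whose step count is O(log n):

```python
def linear_search(items, target):
    """O(n): step count grows proportionally with input size."""
    steps = 0
    for item in items:
        steps += 1
        if item == target:
            break
    return steps

def binary_search(items, target):
    """O(log n): step count grows logarithmically (items must be sorted)."""
    steps, lo, hi = 0, 0, len(items) - 1
    while lo <= hi:
        steps += 1
        mid = (lo + hi) // 2
        if items[mid] == target:
            break
        elif items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return steps

for n in (1_000, 1_000_000):
    data = list(range(n))
    # Worst case for linear search: the target is the last element.
    print(n, linear_search(data, n - 1), binary_search(data, n - 1))
```

Doubling the input roughly doubles the linear search's worst-case steps, while the binary search gains only one extra step, which is exactly the kind of growth-rate distinction big O notation captures.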