Friday 6 December 2013

Selection Sort, Quick Sort, Merge Sort, and Efficiency

Selection sort works by dividing the input list into two parts: a sorted sublist, built up from left to right at the front of the list, and the remaining unsorted items. Initially the entire list is unsorted; the algorithm proceeds by finding the smallest unsorted element, exchanging it with the leftmost unsorted element, and moving the boundary one element to the right. In terms of efficiency it is O(n^2), and it generally performs worse than insertion sort.
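
To make sure I understand how it works, here is a minimal selection sort sketch that I wrote in Python (my own code, not from lecture):

    def selection_sort(lst):
        """Sort lst in place by repeatedly selecting the smallest unsorted item."""
        for i in range(len(lst)):
            # Find the index of the smallest element in the unsorted part lst[i:].
            smallest = i
            for j in range(i + 1, len(lst)):
                if lst[j] < lst[smallest]:
                    smallest = j
            # Exchange it with the leftmost unsorted element; the sorted front grows by one.
            lst[i], lst[smallest] = lst[smallest], lst[i]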

Quick sort, also known as partition-exchange sort, works by choosing an element to be the pivot, dividing the remaining elements into two partitions (all elements less than the pivot go in the first partition, and all elements greater than the pivot in the second), recursively sorting the partitions, and finally joining the first sorted partition, the pivot, and the second sorted partition. Regarding efficiency, quick sort runs in O(n log n) time on average but degrades to O(n^2) in the worst case, when the chosen pivots repeatedly split the list unevenly.
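
A short quick sort sketch (again my own code, returning a new list and simply using the first element as the pivot):

    def quick_sort(lst):
        if len(lst) <= 1:
            return lst
        pivot = lst[0]
        smaller = [x for x in lst[1:] if x < pivot]
        larger = [x for x in lst[1:] if x >= pivot]
        # Recursively sort each partition, then join them around the pivot.
        return quick_sort(smaller) + [pivot] + quick_sort(larger)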

Merge sort is a comparison-based sorting algorithm. It works by dividing the list into n sublists, each containing one element, and then repeatedly merging sublists in order to produce new sorted sublists until only one sublist remains: the sorted list. Thus the algorithm can be described as 'divide and conquer'. With respect to its efficiency, its worst-case complexity is O(n log n); the natural-merge variant can reach O(n) on input that is already sorted.
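
And a merge sort sketch (my own code): split the list in half, sort each half recursively, then merge the two sorted halves.

    def merge_sort(lst):
        if len(lst) <= 1:
            return lst
        mid = len(lst) // 2
        return merge(merge_sort(lst[:mid]), merge_sort(lst[mid:]))

    def merge(left, right):
        """Merge two sorted lists into one sorted list."""
        result = []
        i = j = 0
        while i < len(left) and j < len(right):
            if left[i] <= right[j]:
                result.append(left[i])
                i += 1
            else:
                result.append(right[j])
                j += 1
        return result + left[i:] + right[j:]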


Reference: Wikipedia

Saturday 30 November 2013

Week 10 & Week 11

In week 10, the university was closed Monday and Tuesday, and there was a term test on Wednesday, so we did not learn any new material this week. I found the term test difficult: I feel that I need to practice more examples using recursion, as this was my weakness on the test.

In week 11, we covered how to go about hiding attributes in Python and the built-in function property (which takes a getter, a setter, a deleter, and a docstring), used to give nuanced access (like making an attribute read-only). We also talked about equivalence and functions that reduce. I felt fairly confident about the material covered this week, although it might have helped if we had practiced these concepts in an exercise.
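
For example, here is a small sketch of property that I wrote to test my understanding (my own example, not from lecture): the _balance attribute is hidden by convention, and since no setter or deleter is supplied, balance is read-only from outside.

    class Account:
        def __init__(self, balance):
            self._balance = balance

        def _get_balance(self):
            """Getter for the hidden _balance attribute."""
            return self._balance

        # Only a getter and a docstring are given, so balance is read-only.
        balance = property(_get_balance, None, None, "The account balance (read-only).")

With this, Account(100).balance returns 100, but trying to assign to balance raises an AttributeError.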

Wednesday 13 November 2013

Week 8 & Week 9

In week 8, we covered binary search trees. These are binary trees in which each node's left subtree contains values smaller than the node's value, and its right subtree contains values larger than it. This is interesting because it makes searching the tree easier, since either the left or the right subtree can be ignored based on the value we are searching for. I felt fairly confident about binary search trees after learning about them. We also started to talk about running time analysis: big-oh. Big-oh is a way of describing an algorithm's performance: how its running time behaves as the input size grows large. Big-oh does not depend on constant factors: for instance, algorithms that take 7 lg n and 3 lg n steps are both described as O(lg n), because for large input sizes the effect of the constant (7 vs 3) becomes insignificant. I also felt fairly confident about big-oh.
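
To check my understanding, here is a rough sketch of searching a binary search tree (my own code, assuming nodes with value, left, and right attributes rather than the class from lecture):

    class BSTNode:
        def __init__(self, value, left=None, right=None):
            self.value = value
            self.left = left
            self.right = right

    def bst_search(node, value):
        """Return True iff value is in the BST rooted at node."""
        if node is None:
            return False
        if value == node.value:
            return True
        # Ignore whichever subtree cannot contain the value.
        if value < node.value:
            return bst_search(node.left, value)
        return bst_search(node.right, value)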

In week 9, we continued talking about big-oh. Big-oh gives an upper bound, so the big-oh classes can be arranged as subsets in a hierarchy. At around this time, we were also learning about big-oh in CSC165, which made it easier to pick up. This week, we also covered selection sort, merge sort, and quick sort. I am a visual learner, and seeing Dr. Heap demonstrate the sorts with the coffee cups helped me learn how each of these sorts works. We also performed running time analysis on these sorts. I felt fairly confident about the material covered this week as well.

Wednesday 30 October 2013

Week 7: Linked Lists and Trees

This week we learned about linked lists and creating trees using linked lists. The material was fairly challenging. The tutorial was helpful as usual, and I learned how to use and implement linked lists.
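
As a reminder to myself, a linked list can be built from nodes that each hold a value and a reference to the next node. A rough sketch (my own code, not the class we used in lecture):

    class LLNode:
        """A single node in a singly linked list."""
        def __init__(self, value, next_node=None):
            self.value = value
            self.next = next_node

    # Build the list 1 -> 2 -> 3 and walk along it.
    head = LLNode(1, LLNode(2, LLNode(3)))
    node = head
    while node is not None:
        print(node.value)
        node = node.next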

Saturday 19 October 2013

Week 5

There were no classes this week: Monday was off for Thanksgiving, and the first test took place on Wednesday. The test went well: I liked that we were allowed to bring in a cheat sheet. There was also an exercise this week, which was more challenging than the rest of the exercises. Since it is midterm season, I did not have a lot of time to spend on the exercise, and I ended up not finishing it, which was frustrating. One improvement that I would suggest for this course is a little more consistency in the difficulty of the assigned work and in how much it is worth.

Object-Oriented Programming and Recursion

Object-oriented programming is based on objects: software entities that contain both data attributes and methods, created from abstract data types that encapsulate data and functions together. Object-oriented programming is useful because data attributes can be hidden and access to them can be restricted to the object's methods, which protects the data from accidental corruption; outside code interacts only with the object's methods. Another benefit of object-oriented programming is reusability: an object can be used by any other program that needs its services.
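
A small Python sketch of this idea (my own example, not from the textbook): the _temperature attribute is hidden by convention, and outside code goes through the object's methods, which can protect the data from bad values.

    class Thermostat:
        def __init__(self, temperature):
            self._temperature = temperature   # hidden data attribute (by convention)

        def get_temperature(self):
            return self._temperature

        def set_temperature(self, temperature):
            # The method can reject values that would corrupt the data.
            if -50 <= temperature <= 50:
                self._temperature = temperature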

A recursive function is a function that calls itself. Recursion is a powerful tool for solving repetitive problems. However, recursive calls are less efficient than loops: each time a recursive function is called, the system must perform several actions (e.g. allocating memory for parameters and local variables, and storing the return address) that are not necessary with a loop. On the other hand, some problems are more easily solved with recursion, so if the recursion does not slow performance too much, it can be a good design choice.
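
For example, the factorial function can be written both ways (my own sketch): the recursive version creates a new call frame for every step, while the loop reuses the same frame.

    def factorial_recursive(n):
        """n! computed with recursion: each call waits on a smaller subproblem."""
        if n <= 1:
            return 1
        return n * factorial_recursive(n - 1)

    def factorial_loop(n):
        """n! computed with a loop: no extra call frames are needed."""
        result = 1
        for i in range(2, n + 1):
            result *= i
        return result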


Reference: Tony Gaddis, Starting Out With Python, 2nd Ed.

From the beginning...

Starting from the beginning, and up until the end of week 4, we learned about object-oriented programming, stacks, inheritance, exceptions, and recursion. I didn't have many problems learning about any of these topics, except for recursion (which I still can't really wrap my head around). 

I've found that the learning style seems to be mainly independent: although the topics are covered in lecture, I need to spend a fair amount of time doing readings and teaching myself content. I would have preferred three hours of lecture per week instead of two, so that we could cover the material more thoroughly in class. The weekly labs are a great way to learn, with the TA there to help you as you work through relevant problems with your partner. The exercises have also been, for the most part, helpful and at a good level of difficulty. However, I found that the first assignment was far, far too difficult. I started working on it about a week before it was due (which I assumed was reasonable for an assignment worth 8%), but I was unable to finish. I am not sure if this is because assignments are meant to be very difficult or because this was a particularly challenging assignment, but I've learned to start the assignments as soon as they are assigned.

Finally, I think these SLOGs are a good way to let us express our opinions about the course, and also a good way of receiving weekly feedback about how things are going.