Posts

Showing posts from July, 2013

Asymptotic Notation

Asymptotic Notation, or Landau Notation, is used to describe the limiting behavior of a function as its argument approaches a particular value or infinity. We have already seen a glimpse of this in the last post, when we were discussing the order of growth of functions.  Ex: O(n). When discussing asymptotic notation, there are three key terms: Theta notation - bounds a function within constant factors, from above and below, after a particular n. Big O notation - represents an upper bound. Big Omega notation - represents a lower bound. Taken from:  Florida State University. In the graph above, c2 g(n) represents Big O notation, because for each n after n0, c2 g(n) >= f(n).  c1 g(n) represents Big Omega notation, because for each n after n0, c1 g(n) <= f(n). Since f(n) has both an O-bound and an Omega-bound with the same g(n), there exists a Theta-bound for f(n). For O-notation and Omega-notation are both the
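As a quick formalization of the three bounds described above (these are the standard textbook set definitions, not taken from the post itself):

```latex
% Standard definitions of the three asymptotic bounds.
O(g(n))      = \{ f(n) : \exists\, c > 0,\ n_0 > 0 \text{ s.t. } 0 \le f(n) \le c\, g(n) \text{ for all } n \ge n_0 \}
\Omega(g(n)) = \{ f(n) : \exists\, c > 0,\ n_0 > 0 \text{ s.t. } 0 \le c\, g(n) \le f(n) \text{ for all } n \ge n_0 \}
\Theta(g(n)) = \{ f(n) : \exists\, c_1, c_2, n_0 > 0 \text{ s.t. } 0 \le c_1 g(n) \le f(n) \le c_2 g(n) \text{ for all } n \ge n_0 \}
```

Note that f(n) belongs to Theta(g(n)) exactly when it belongs to both O(g(n)) and Omega(g(n)), which is the relationship the graph illustrates.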

Algorithms - Introduction to the Basic Concepts

Chapter 2 - Getting Started: Loop Invariant, Case Analysis, Order of Growth. Checking the loop invariant: when analyzing an algorithm using a loop invariant (for insertion sort, the claim that the subarray processed so far consists of its original elements in sorted order) to see if it is correct, we check 3 properties: Initialization - the loop invariant must be true prior to the first iteration of the loop. Maintenance - if it is true before an iteration of the loop, it must remain true before the next iteration. Termination - when the loop terminates, the invariant must give a property that helps show that the algorithm is correct. The first two properties help to show that the loop of the algorithm is correct.  The third property is a confirmation that the algorithm has worked. Let's put this into better perspective now, by using insertion sort. List given:  4 3 1 2 5 6 Initialization - Since insertion sort starts at the second element, the only part that is in the lo
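As a sketch of the idea (my own minimal implementation, using the list from the post), here is insertion sort with the loop invariant called out in comments:

```java
import java.util.Arrays;

public class InsertionSort {
    static void insertionSort(int[] a) {
        // Loop invariant: before each iteration, a[0..i-1] holds the
        // elements originally in a[0..i-1], in sorted order.
        for (int i = 1; i < a.length; i++) {
            int key = a[i];
            int j = i - 1;
            // Shift elements larger than key one slot right.
            while (j >= 0 && a[j] > key) {
                a[j + 1] = a[j];
                j--;
            }
            a[j + 1] = key;
            // Invariant restored: a[0..i] is now sorted (Maintenance).
        }
        // Termination: i == a.length, so a[0..length-1] is sorted.
    }

    public static void main(String[] args) {
        int[] a = {4, 3, 1, 2, 5, 6};
        insertionSort(a);
        System.out.println(Arrays.toString(a)); // [1, 2, 3, 4, 5, 6]
    }
}
```

Initialization holds trivially here: before the first iteration, a[0..0] is a single element and therefore already sorted.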

Algorithms - Ch 1 Foundations

Late post b/c vacation and stuff, but I'm back now. One of my algorithm books came in today.  I decided to order two: one for my class coming up this quarter, and another that I found regarded as the standard in the field. Algorithm Design: Foundations, Analysis, and Internet Examples   by Michael T. Goodrich (class) Link:  http://amzn.com/0471383651 Introduction to Algorithms   by Thomas H. Cormen Link:  http://amzn.com/0262033844 Introduction to Algorithms just came in, and so far it's a pretty good read.  It comes across as very accessible to newbies like myself, but still has rigor and depth to it.  For the next couple of weeks, I think I will focus on burning through the first few chapters of this book, which will help me with my class this quarter.  I'll probably still work on Java projects, but I think that will be a side thing I do in my spare time for now. Ch 1 - Foundations 1.1 Role of Algorithms in Computing Algorithms take an input an

Java Multithreading - Part 1

I brought up the topic of concurrency in the Sorting post (concurrent merge sort), but never really got to explain properly what it actually is.  So in this blog post, I will go into what concurrency means, the tools Java provides to enable it, and some common problems that arise when dealing with concurrency. The Concept: Concurrency is the ability to do more than one task at a time.  The word parallelism is sometimes intertwined with this definition, but depending on who you talk to, the two are not always the same.  The way I see the difference between the two terms: concurrency is the idea that different programs can overlap and interleave with one another, creating the appearance to the user that multiple applications are running at the same time, when in fact each is only being given small time slices of processor usage.  Parallelism is the idea that two programs can actually run at the exact same time, enabled by the CPU s
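A minimal sketch of two tasks interleaving in Java (my own example, not from the post): two threads each increment a shared counter, and the scheduler decides how their work overlaps. Using AtomicInteger keeps the increments from being lost to interleaving, which previews the "common problems" mentioned above.

```java
import java.util.concurrent.atomic.AtomicInteger;

public class ConcurrencyDemo {
    // Runs two threads that each add 1000 to a shared counter,
    // then returns the final count once both have finished.
    static int runCounters() throws InterruptedException {
        AtomicInteger counter = new AtomicInteger(0);
        Runnable task = () -> {
            for (int i = 0; i < 1000; i++) {
                counter.incrementAndGet(); // atomic, so no lost updates
            }
        };
        Thread t1 = new Thread(task);
        Thread t2 = new Thread(task);
        t1.start();
        t2.start();
        t1.join(); // wait for both threads to complete
        t2.join();
        return counter.get();
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println(runCounters()); // 2000
    }
}
```

If the counter were a plain int with `counter++`, the two threads' read-modify-write steps could interleave and the final total could come up short; that race condition is exactly the kind of problem this series will get into.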

BlackJack Project

I decided to do something different for this post.  As much fun as it is to research and study programming concepts, sometimes you just feel like coding something up that does something!  In my case, I was inspired by a card-dealing method that appeared in one of my previous blog posts, so I decided to make the simple game of BlackJack. For those unfamiliar, BlackJack is the casino game of 21, where you play against the dealer and try to get as close to a card value of 21 as possible without going over.  You start out with two cards, and you may draw a card (hit) or stay with your current cards. The dealer begins with one card up and one card down.  I decided to go with hard 17, where the dealer must draw until he has a card value of 17 or more.  To win money, you must beat the dealer without busting (going over 21).  Face cards are worth 10, while Aces can be 1 or 11.  I purposely left out splitting and the gambling aspect, which I will return to in the
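The hand-valuation rule above (face cards worth 10, Aces 1 or 11) can be sketched like this. This is a hypothetical helper, not the project's actual code: it counts each Ace as 11 first, then demotes Aces to 1 while the hand would bust.

```java
public class BlackjackValue {
    // ranks: 1 = Ace, 2-10 face value, 11-13 = Jack/Queen/King
    static int handValue(int[] ranks) {
        int total = 0;
        int aces = 0;
        for (int r : ranks) {
            if (r == 1) {
                aces++;
                total += 11;            // count Aces high to start
            } else {
                total += Math.min(r, 10); // J/Q/K are all worth 10
            }
        }
        // Demote Aces from 11 to 1 while the hand would bust.
        while (total > 21 && aces > 0) {
            total -= 10;
            aces--;
        }
        return total;
    }

    public static void main(String[] args) {
        System.out.println(handValue(new int[]{1, 12}));    // Ace + Queen = 21
        System.out.println(handValue(new int[]{1, 9, 13})); // Ace demoted: 1 + 9 + 10 = 20
    }
}
```

The same method works for the dealer's hard-17 rule: keep drawing while `handValue(hand) < 17`.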