Object-Oriented Programming and Data Structures CS 2110, Fall 2013 Version: 1

Problem Set 6: Ultimate Butterfly! Due at 11:59 PM, December 4. Last Modified: November 16, 2013

Preamble
Please read the whole handout before starting. At the end of this handout, we will tell you what to submit and how. We will ask you for the time you spent on this assignment, so please keep track of it.

Collaboration Policy and Academic Integrity


You may do this assignment with one other person. Both members of the group should get on the CMS and do what is required to form a group well before the assignment due date. Both must do something to form the group: one proposes, the other accepts. People in a group must work together. It is against the rules for one person to do some programming on this assignment without the other person sitting nearby and helping. Take turns "driving" (using the keyboard and mouse). With the exception of your CMS-registered group partner, you may not look at anyone else's code, in any form, or show your code to anyone else (except the course staff), in any form. You may not show or give your code to another student in the class.

Code Style
Please pay attention to style. Refer to the CS 2110 style guide and lecture notes. Ugly code that is functionally correct may still lose points. Take the extra time to think through the problems and find the most elegant solutions before coding them up. Good programming style is important for all assignments throughout the semester.

Overview
In A5, you completed a Depth First Search of a Map. Your butterfly visited and learned the state of every flyable Tile. You gained some preliminary experience working with the list of Flowers growing on the Map. In A6, we pull out all the stops and give you only a little formal guidance. The goal is to first execute your method learn (the same one from A5) and then, in method run, to visit each Flower associated with a list of flower ids, one by one, using any method you like. We will award prizes for the best solutions in several categories, including flying time of the butterfly, performance of your code (the CPU time it takes to run), and code/documentation quality.

Note: For the CPU-time part of the competition, only submissions that give correct results on all test maps will be considered. We plan to use the top 10-20 correct submissions, rank-ordered by butterfly flying time, as candidates for the CPU-time competition. All submissions will be compiled and tested under identical conditions.

The Danaus World


In prior assignments, we haven't told you everything there is to know about Danaus; you didn't need all the information. But for A6, understanding the details of Danaus is imperative. For a complete overview of Danaus's mechanics, please read the Danaus reference distributed with this handout. Note: You have already seen some of the reference material in A3 and A5, so reading the reference will be easier.

Your Tasks
Implement learn. Implement run. For a detailed description of learn and run, refer to the Danaus Reference. We will not release solutions to A5; you must use your own solution to A5 to build A6.

A simple solution
A simple but very inefficient solution is to have run() re-execute the same code that learn() used, while collecting flowers along the way. This will work, but it runs into an efficiency issue. Your Butterfly will make many superfluous actions and will operate so inefficiently that it might easily run out of power (if this happens, the butterfly must pause to recharge its battery). Recall that when using Depth First Search, the butterfly sometimes chooses extremely indirect flight paths from point A to point B. For example, Figure 1 shows a flight path from A5. Pretend the flower you are looking for is in the bottom left corner of the map.

Figure 1: Depth First Search, step one.

In this example, the butterfly started at the top left and searched the park until it was finally 45 levels of recursion deep in the Depth First Search and at a dead end. To the W and SW there are cliffs. The tiles in directions NW, N, NE, E, SE, and S have all been visited. So the A5 version of learn had the butterfly backtrack: the recursive method returns, and the butterfly flies the reversed path back, step by step. It takes the butterfly 26 turns to get to a location that was actually one hop away! To see this, we have colored the superfluous moves yellow in Figure 2. Notice that we could have reached the same spot in just one move from tile (3,3), where the butterfly hit the dead end: NW directly to tile (2,2)! Thus, although you can solve A6 by just re-running learn (but without much hope of winning the competition), improving the butterfly's flight path even a bit can save a LOT of flying time. A sketch of one way to compute such a direct path appears below.
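To make that concrete, here is a minimal sketch of computing a direct path over the map learned so far. It is not part of the Danaus API: the boolean[][] grid of flyable tiles, the {row, column} coordinate pairs, and the assumption that the butterfly can step to any of its eight neighboring tiles at equal cost are all our own simplifications. Adapt the representation to whatever your learn() actually records, and switch to Dijkstra's algorithm if some moves (for example, slow turns) should cost more than others.

    import java.util.*;

    /** Sketch only: BFS over the learned map, assuming equal cost per move.
     *  The boolean[][] flyable grid and 8-direction movement are assumptions
     *  about how you might represent what learn() discovered. */
    public class PathPlanner {
        // The eight compass directions (N, NE, E, SE, S, SW, W, NW).
        private static final int[] DR = {-1, -1, 0, 1, 1, 1, 0, -1};
        private static final int[] DC = {0, 1, 1, 1, 0, -1, -1, -1};

        /** Returns a shortest sequence of {row, col} tiles from (sr,sc) to
         *  (tr,tc), excluding the start tile, or null if unreachable.
         *  flyable[r][c] is true for tiles the butterfly may enter. */
        public static List<int[]> shortestPath(boolean[][] flyable,
                                               int sr, int sc, int tr, int tc) {
            int rows = flyable.length, cols = flyable[0].length;
            int[][] prev = new int[rows][cols];          // encoded predecessor
            for (int[] row : prev) Arrays.fill(row, -2); // -2 means unvisited
            Deque<int[]> queue = new ArrayDeque<>();
            queue.add(new int[]{sr, sc});
            prev[sr][sc] = -1;                           // -1 marks the start
            while (!queue.isEmpty()) {
                int[] cur = queue.poll();
                if (cur[0] == tr && cur[1] == tc) break;
                for (int d = 0; d < 8; d++) {
                    int nr = cur[0] + DR[d], nc = cur[1] + DC[d];
                    if (nr < 0 || nr >= rows || nc < 0 || nc >= cols) continue;
                    if (!flyable[nr][nc] || prev[nr][nc] != -2) continue;
                    prev[nr][nc] = cur[0] * cols + cur[1];
                    queue.add(new int[]{nr, nc});
                }
            }
            if (prev[tr][tc] == -2) return null;         // target unreachable
            LinkedList<int[]> path = new LinkedList<>();
            int r = tr, c = tc;
            while (prev[r][c] != -1) {                   // walk predecessors back
                path.addFirst(new int[]{r, c});
                int p = prev[r][c];
                r = p / cols;
                c = p % cols;
            }
            return path;
        }
    }

If the map wraps around at its edges (check the Danaus reference), the neighbor computation above would need to wrap as well.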

Figure 2: Depth First Search, step two.

How could you design a better flight-planning algorithm? We leave that to you, but some ideas to consider include the shortest-path algorithm (Dijkstra's) and the spanning-tree algorithms we learned about in class. In fact, you can do even more than this: by the end of learn(), the butterfly will have learned the locations of many of the Flowers it is expected to visit. Thus, even before run takes a single step, you could preplan a path that takes the butterfly to one flower after another reasonably efficiently! The only challenge is that we may have added new flowers, not seen in learn(), to the list of Flowers you need to collect.

When doing flight planning, keep in mind that it is usually safe to fly slowly. But if you plan to fly quickly, make sure to do it in bright sunlight. Flying quickly in the shade is usually a mistake!

So what about those extra flowers? Here, once again, there are some choices to make. We'll outline three ways to find them.

(1) As the butterfly visits the flowers you already know about, call refreshState() once more for each tile it revisits for the first time. Maybe you'll get lucky and fly right over one you need.

(2) If the list of flowers (the argument to method run) includes even a single new flower, you could just re-execute the same kind of search that learn() used on the whole flyable map and collect the flowers as you find them. This will always work but could be a bit slow.

(3) Use the science of Flower Smellology to determine the locations of the flowers based on what you initially learned and a tiny amount of additional data. Based on this, add predicted flower locations to your list of locations to visit. Then do optimal flight planning and fly the best possible path to each flower, one by one. The winner of the speed competition is likely to use this method. On the other hand, using this approach without doing a huge amount of computation could involve some hard work and smart data structures!
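For the preplanning idea in particular, one simple heuristic is a greedy nearest-neighbor ordering of the flower locations you already know: from wherever the butterfly currently is, fly to the closest unvisited flower next. The sketch below is again only an illustration under our own assumptions (locations as {row, column} int pairs, distances taken from the PathPlanner sketch above); it produces a reasonable tour, not an optimal one.

    import java.util.*;

    /** Sketch only: greedy nearest-neighbor ordering of known flower
     *  locations, using BFS path lengths from PathPlanner as distances. */
    public class TourPlanner {

        /** Returns the flower locations reordered greedily: from the current
         *  position, always fly to the closest not-yet-visited flower. */
        public static List<int[]> greedyOrder(boolean[][] flyable,
                                              int startRow, int startCol,
                                              List<int[]> flowers) {
            List<int[]> remaining = new ArrayList<>(flowers);
            List<int[]> order = new ArrayList<>();
            int r = startRow, c = startCol;
            while (!remaining.isEmpty()) {
                int best = -1, bestDist = Integer.MAX_VALUE;
                for (int i = 0; i < remaining.size(); i++) {
                    int[] f = remaining.get(i);
                    List<int[]> path =
                            PathPlanner.shortestPath(flyable, r, c, f[0], f[1]);
                    if (path != null && path.size() < bestDist) {
                        best = i;
                        bestDist = path.size();
                    }
                }
                if (best == -1) break;          // remaining flowers unreachable
                int[] next = remaining.remove(best);
                order.add(next);
                r = next[0];
                c = next[1];
            }
            return order;
        }
    }

Re-running BFS for every candidate is wasteful; if you care about the CPU-time prize, consider caching pairwise distances (for example, one BFS per flower) and reusing the computed paths when the butterfly actually flies.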

How We Grade
We will grade A6 according to three criteria of descending importance: correctness, clarity and readability, and performance.

We will first check the correctness of your methods learn() and run(). Did the TileState[][] you created in learn() correctly discover the TileState of every flyable tile? This was the goal in A5, but we re-check it in A6. Next, we check that run() collected samples from the full list of flowers we provided, in any order, including flowers that started to bloom just before we called run(). A solution is correct if it passes these tests. The award for correct code is a good grade.

Next, we will evaluate the clarity and readability of your code. Ugly and impenetrable code will not be considered for the clarity prize, and it could result in a lower grade. Make sure to add comments where appropriate, follow good variable-naming conventions, etc.

Among the correct and clear solutions, we compare the performance metrics described in the attached reference. Specifically, there are four categories in which to win quantitative awards: (1) fewest number of turns, including slowTurns; (2) fastest actual learn time; (3) fastest actual run time; (4) fastest actual total time. Prizes will go to the solutions that have the best average score across our full set of maps. Also, the butterfly must be within the top 25% on each individual map we test. This rule is intended to avoid awarding the fastest-butterfly prize to a solution that is very fast on most maps but unusually slow on some of them.

There may also be other subjective awards distributed at the discretion of the course staff: for example, most creative solution, most sophisticated algorithm, etc.

Note: Do not cheat (e.g., by breaking the Java language rules in some way to gain unfair access to the map). Do not use the Java reflection mechanisms. If any code is caught cheating, it will be disqualified from the competition, and your grade will also suffer.

Note: Some algorithms may be computationally complex. While we don't grade on efficiency, we do require that your methods terminate in a reasonable amount of time. If your code requires more time to run than you have patience for, we will deduct points.

PS: We may give awards to more than just the single best performer in each category. Second and third place prizes will be awarded when deemed appropriate.

PPS: The prizes are mostly symbolic and vary across categories. Some categories are more difficult to do well in, and they will likely yield better awards. We usually use CTB gift cards. And of course you can list that you won the prize on your resume.

Deliverables
Checklist
Before you submit your assignment, check to see that you have completed the following tasks.

- You have read this document and the reference in excruciating detail.
- You have reminded yourself about the academic code of integrity.
- You have reviewed CS 2110's coding style guidelines, and your code adheres to them: the fields of class Butterfly are annotated with the class invariant, and each method of class Butterfly has a good javadoc specification.
- Your class Butterfly is in package student, not in package danaus.
- You have placed the total time spent on the assignment at the top of Butterfly in the following format: /* Time Spent: XX hours and YY minutes. */
- If you have a partner, you have formed a group on the CMS. This must be done before you submit. The two of you will receive a single shared grade.

Submission
On the CMS, submit two files.

- Butterfly.java. This file contains subclass Butterfly, which contains your new method learn(), built in the way we have described, and your method run().
- README.txt. This file should contain a brief explanation of your algorithm for run(). Also, give your opinion of this project. README.txt will not be graded, so you need not labor over writing a README magnum opus.
