Search Results

Search found 23207 results on 929 pages for 'node form'.


  • Share variable across site ASP.NET

    - by Anders
    I have a class isSearching with a single boolean property in a 'functions' file in my webapp. On my search page, I have a variable oSearchHandler declared as a Public Shared variable. How can I access the contents of oSearchHandler on other pages in my webapp? Code with Session.... 'search.aspx Public Function oSearchString(ByVal oTextBoxName As String) As String For Each oKey As String In Request.Form.AllKeys If oKey.Contains(oTextBoxName) Then Session.Add("searching", True) Session.Add("search-term", Request.Form(oKey)) Return Request.Form(oKey) End If Next Return "" End Function 'theMaster.master <% If Session("searching") Then %><ul style="float: right;"> <li> <div class="gsSearch"> <asp:TextBox ID="searchbox" runat="server"></asp:TextBox> </div> </li> <li> <div class="gsSearch"> <asp:Button ID="searchbutton" runat="server" Text="search" UseSubmitBehavior="true" PostBackUrl="search.aspx" CssClass="searchBtn" /> </div> </li> </ul> <% End If %> I think that the session will work just fine.

    Read the article

  • JavaScript two-dimensional Array to PHP

    - by vi
    Hi, I have to send a two-dimensional JavaScript array to a PHP page. Indeed, I'm working on a form builder, in which the user can add or remove fields. These fields are added (or removed) using JavaScript (jQuery). When the user is done and hits a 'publish' button, I have to get all the fields concerned and send them to a PHP page which would build a real form with them. I found a way to do it but I'm pretty sure it's not very clean: addedFields = new Array(); $("#add-info .field").each(function() { addedFields.push(new Array($(this).find('.name').val(), $(this).find('.type').val(), $(this).find('.size').val())); }); Basically, the ".field" class objects are <tr> and the ".name", ".type" and ".size" objects are inputs. So I get an array of [name, type, size], then I convert it into a string using addedFields = addedFields.join(";"); Finally, I go to the PHP page this way: document.location.href = "create.php?addedfields=" + addedFields; Concerning the PHP code, I create a PHP array using the explode() function: $addedFields = explode(";", $_GET['addedfields']); and then I use it again for each element in the array: foreach ($addedFields as $field) { $field = explode(",", $field); echo "<li>Field with name : '$field[0]', of '$field[1]' type and with a size of $field[2]"; }
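    A cleaner alternative, sketched here as an assumption rather than the linked answer, is to serialize the field list as JSON and POST it instead of joining on ";" and "," and putting it in the URL: on the client something like $.post("create.php", { addedfields: JSON.stringify(addedFields) }), and on the PHP side json_decode() rebuilds the nested array. Only the field name "addedfields" comes from the question; everything else is illustrative.

      <?php
      // create.php - minimal sketch, assuming the client POSTs
      // JSON.stringify(addedFields) in a field named "addedfields".
      $json = isset($_POST['addedfields']) ? $_POST['addedfields'] : '[]';
      $addedFields = json_decode($json, true);   // array of [name, type, size] rows

      if (!is_array($addedFields)) {
          die('Invalid field data.');
      }

      foreach ($addedFields as $field) {
          list($name, $type, $size) = $field;
          // htmlspecialchars() keeps user-supplied field names from breaking the markup
          echo '<li>Field with name: ' . htmlspecialchars($name)
             . ', of ' . htmlspecialchars($type)
             . ' type and with a size of ' . htmlspecialchars($size) . '</li>';
      }

    This also avoids the URL-length and escaping problems that come with passing the data through document.location.href.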

    Read the article

  • Rails form with a better URL

    - by Sam
    Wow, switching to REST is a different paradigm for sure and is mainly a headache right now. view <% form_tag (businesses_path, :method => "get") do %> <%= select_tag :business_category_id, options_for_select(@business_categories.collect {|bc| [bc.name, bc.id ]}.insert(0, ["All Containers", 0]), which_business_category(@business_category) ), { :onchange => "this.form.submit();"} %> <% end %> controller def index @business_categories = BusinessCategory.find(:all) if params[:business_category_id].to_i != 0 @business_category = BusinessCategory.find(params[:business_category_id]) @businesses = @business_category.businesses else @businesses = Business.all end respond_to do |format| format.html # index.html.erb format.xml { render :xml => @businesses } end end routes map.resources What I want to do is get a better URL than the one this form is producing, which is the following: http://localhost:3000/businesses?business_category_id=1 Without REST I would have done something like http://localhost:3000/business/view/bbq (bbq as a permalink), or I would have done http://localhost:300/business_categories/view/bbq and gotten the businesses that are associated with the category, but I don't really know the best way of doing this. So the two questions are: first, what is the best logic for finding a business by its category using the latter form, and second, how do I get that into a pretty URL, all through RESTful routes in Rails?

    Read the article

  • Listbox selected count doesn't work

    - by sadpcd
    I am trying to make a form with a multiple select box. When someone selects options, the number of selected options should be displayed in another text field. I'm a beginner in JavaScript. The function is called, but it doesn't count the number of selected options. <select name="element_17_1[ ]" size="7" multiple="multiple" class="element select medium" id="element_17_1[ ]" onfocus="selectCount(this.form);" onClick="selectCount(this.form);" > <option value="Opt1">Opt1</option> <option value="Opt2">Opt2</option> <option value="Opt3">Opt3</option> <option value="Opt4">Opt4</option> <option value="Opt5">Opt5</option> <option value="Opt6">Opt6</option> <option value="Opt7">Opt7</option> </select> and this is the function I tried in the <head>: function selectCount(f) { var selObj = myForm.elements['element_17_1[]']; var totalChecked = 0; for (i = 0; i < selObj.options.length; i++) { if (selObj.options[i].selected) { totalChecked++; } } f.element_9.value = totalChecked; }

    Read the article

  • jQuery: serializing array returns empty string

    - by John Smith
    I did not forget to add name attributes as is a common problem and yet my serialized form is returning an empty string. What am I doing wrong? HTML/javascript: <head> <script src="http://ajax.googleapis.com/ajax/libs/jquery/2.1.0/jquery.min.js"></script> <script> $( document ).ready( function() { $('#word_form').submit(function(e) { e.preventDefault(); console.log($(this).serialize()); //returns an empty string }); }); </script> </head> <body> <div id="wrapper"> <form name="word_form" id="word_form" method="POST"> <input type="image" name="thumbsUp" id="thumb1" value="1" src="http://upload.wikimedia.org/wikipedia/commons/8/87/Symbol_thumbs_up.svg" style="width:50px;height:50px;"> <input type="image" name="thumbsDown" id="thumb2" value="2" src="http://upload.wikimedia.org/wikipedia/commons/8/84/Symbol_thumbs_down.svg" style="width:50px;height:50px;"> </form> </div> </body> Thanks!

    Read the article

  • Why can I not echo out the value from an input text?

    - by user3684783
    I am using WordPress to build an auction website. Here is what the code looks like: <form method="post" action="<?php echo ProjectTheme_post_new_with_pid_stuff_thg($pid, '1');?>"> <?php do_action('ProjectTheme_step1_before_title'); ?> <!--////////// Project Title /////////////--> <li> <h2><?php echo __('Your Project Title', 'ProjectTheme'); ?>: <img src="../../images/help-icon.png" width="16" height="16" id="showhelp1"/></h2> <div id="help1">Your Project Title should be informative and brief.</div> <p><input type="text" style="width:90%;" class="do_input" name="project_title" value="Enter an informative & brief title..." onfocus="this.value = this.value=='Enter an informative & brief title...'?'':this.value;" onblur="this.value = this.value==''?'Enter an informative & brief title...':this.value;" /></p> <input type="submit" class="post-button" name="project_submit1" value="<?php _e("Next Step", 'ProjectTheme'); ?> &raquo;" /> </form> I am using a step-by-step form for users to fill in, and I wanted to make a preview page; however, when I tried to use: if(isset($_POST['project_submit1'])){ $Name = $_POST['project_title']; echo "$Name"; } nothing shows up.
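    A hedged debugging sketch for this situation: the form posts to whatever URL ProjectTheme_post_new_with_pid_stuff_thg() returns, so the echo can only work if it runs in the template rendered by that URL, before the theme redirects to the next step. Dumping $_POST there shows whether project_title arrives at all. sanitize_text_field() and esc_html() are standard WordPress helpers; the placement is an assumption.

      <?php
      // Minimal sketch - assumes this runs in the template rendered by the
      // form's action URL, before the theme redirects to step 2.
      if (isset($_POST['project_submit1'])) {
          error_log(print_r($_POST, true));   // inspect exactly what was posted

          $name = isset($_POST['project_title'])
              ? sanitize_text_field($_POST['project_title'])
              : '';

          if ($name !== '') {
              echo esc_html($name);           // preview the submitted title
          } else {
              echo 'project_title was not posted';
          }
      }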

    Read the article

  • How to add a checkbox for each row in Rails 3.2 index page?

    - by user938363
    We would like to add a checkbox to each row on a Rails index page to flag that row. The checkbox is not part of the object (there is no checkbox boolean in the database). When the index page shows, a user can check the box to trigger an event for the row in the following process: #objects/checkbox_index.html.erb <table> <tr> <th>CheckBox</th> <th>Object Name</th> <th>Object ID</th> </tr> <%= @objects.each do |obj| %> <tr> <td><%= checkbox %></td> <td><%= obj.name %></td> <td><%= obj.id %></td> </tr> <% end %> </table> In the controller, the process will be like this: @objects.each do |obj| some_event if obj.checked end There are a couple of things we don't quite understand: 1. How to declare an array checkbox variable on the form and link it to each row of obj? We have been using `attr_accessor` to declare a variable for a form. 2. How to retrieve each row on the checkbox_index form and pass them back to the controller? We are using simple_form for new/edit. Can anyone point me towards any good examples of this sort of behavior, or suggest what we should be thinking about? Many thanks.

    Read the article

  • How do I get the selected value from a dropdown list in PHP?

    - by sandy
    Hello. I want to know the error in this code. The following code retrieves the names of the members from a database query into the dropdown list, but how do I know which member was selected? I want to send messages only to the member that was selected from the dropdown list. <?php include ("connect.php"); $name = $_POST['sector_list']; echo $name ; ?> <form method="POST" action="" > <input type="hidden" name="sector" value="sector_list"> <select name="sector_list" class="inputstandard"> <option size ="40" value="default">send to </option> <?php $result = mysql_query('select * from members ') or die (mysql_error()); while ($row = mysql_fetch_assoc($result)) { echo '<option size ="40" value=" '. $row['MemberID'] . '" name="' . $row['MemberName']. '">' . $row['MemberName']. '</option>'; } ?> </select> </form> I hope somebody can help me.
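    A minimal sketch of one way this could look (an assumption, not the linked answer): the selected option's value arrives in $_POST['sector_list'] once the form is actually submitted, so the form needs a submit button, the option value should not contain stray spaces, and <option> elements don't take a name attribute. The question's mysql_* calls are kept for continuity even though they are deprecated in current PHP.

      <?php
      include ("connect.php");

      // After submission, the chosen option's value (the MemberID) is in $_POST.
      if (isset($_POST['sector_list']) && $_POST['sector_list'] !== 'default') {
          $memberId = (int) $_POST['sector_list'];
          echo 'Sending message to member #' . $memberId;
          // ... look up the member's address here and send the message ...
      }
      ?>
      <form method="POST" action="">
        <select name="sector_list" class="inputstandard">
          <option value="default">send to</option>
          <?php
          $result = mysql_query('SELECT MemberID, MemberName FROM members') or die(mysql_error());
          while ($row = mysql_fetch_assoc($result)) {
              // the value carries the ID; the visible text is the member's name
              echo '<option value="' . (int) $row['MemberID'] . '">'
                 . htmlspecialchars($row['MemberName']) . '</option>';
          }
          ?>
        </select>
        <input type="submit" value="Send" />
      </form>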

    Read the article

  • Why does this SQL statement keep saying it is a boolean and not a parameter? (PHP/MySQL)

    - by ggfan
    In this statement, I am trying to see if the latest posting in the database has the exact same title, price, city, state and detail. If there is one, then it should tell the user that the exact post has already been made; if not, then insert the posting into the database. (This is one type of check so that users can't accidentally post twice. This may not be the best check, but this statement error is annoying me, so I want it to work :)) Why won't this SQL work? I think it's not letting the title=$title and not getting the value into $title... ERROR: mysqli_num_rows() expects parameter 1 to be mysqli_result, boolean given in postad.php on line 365 //there is a form that users fill out that has title, price, city, etc <form> blah blah </form> //if users click submit, then do all the checks and if all okay, insert into the dbc if (isset($_POST['submit'])) { // Grab the posting data from the POST and get rid of any funny stuff $title = mysqli_real_escape_string($dbc, trim($_POST['title'])); $price = mysqli_real_escape_string($dbc, trim($_POST['price'])); $city = mysqli_real_escape_string($dbc, trim($_POST['city'])); $state = mysqli_real_escape_string($dbc, trim($_POST['state'])); $detail = mysqli_real_escape_string($dbc, trim($_POST['detail'])); if (!is_numeric($price) && !empty($price)) { echo "<p class='error'>The price can only be numbers. No special characters, etc</p>"; } //Error problem...won't let me set title=$title, detail=$detail, etc. //this statement runs after all the checks so that none of the variables are empty $query="Select * FROM posting WHERE user_id={$_SESSION['user_id']} AND title=$title AND price=$price AND city=$city AND state=$state AND detail=$detail"; $data = mysqli_query($dbc, $query); if(mysqli_num_rows($data)==1) { echo "You already posted this ad. Most likely caused by refreshing too many times."; } }
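    The error means mysqli_query() returned false instead of a result set: the string values are interpolated into the SQL without quotes, so MySQL rejects the statement. A hedged sketch of the same duplicate check written as a prepared statement, assuming $dbc and the variables from the question are in scope (binding also makes the manual escaping unnecessary):

      <?php
      // Minimal sketch: a prepared statement quotes/binds the values itself,
      // so the query can no longer fail because of an unquoted string literal.
      $query = 'SELECT * FROM posting
                WHERE user_id = ? AND title = ? AND price = ?
                  AND city = ? AND state = ? AND detail = ?';

      $stmt = mysqli_prepare($dbc, $query);
      mysqli_stmt_bind_param($stmt, 'isdsss',
          $_SESSION['user_id'], $title, $price, $city, $state, $detail);
      mysqli_stmt_execute($stmt);
      mysqli_stmt_store_result($stmt);

      if (mysqli_stmt_num_rows($stmt) >= 1) {
          echo "You already posted this ad. Most likely caused by refreshing too many times.";
      }

    As a quicker diagnosis of the original query, echoing mysqli_error($dbc) right after mysqli_query() would have shown the syntax error MySQL reports for the unquoted values.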

    Read the article

  • Access database: need to prevent approving overlapping overtime (second try with modified request; not a programmer) [on hold]

    - by user2512764
    Employees sign up on the company website for advance overtime lines. The Access table already has the overtime signups, so the user does not need to add the time; they only need to add the location to mark a line as approved. The table has the fields Employee Name, Date, Start Time, End Time and Location, and all the fields have data except for Location. In the database I have created a form based on this table. Since the table already has most of the information, the user only has to add the location in the form field in order to approve the overtime. For example: the user approves overtime for employee 'John' which runs on 7/1/2013 from 0400-0800, and the location is successfully added. Then the user tries to add a location for John again, for a line on 7/1/2013 from 0600-0900. Again, we are not entering the start time, end time and date, they are already in the table; we are only entering the location as the approval. As soon as the user enters the location for John in the form field, because there is a conflict with a previously approved overtime line, the program needs to check the employee name, date and time of the previously approved (location added) overtime line, and the location in the current record needs to be deleted before moving on to the next record. I hope I have explained it in an understandable format. Thank you,
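    The check being described is a standard interval-overlap test: a new approval conflicts with an existing approved line when it is for the same employee, on the same date, and starts before the existing line ends while ending after it starts. Purely as an illustration of that logic (the field names are hypothetical, and in Access the same condition would go into the query criteria or the form's BeforeUpdate code), a small PHP sketch:

      <?php
      // Illustration only: hypothetical rows with employee, date, start and end.
      // Two overtime lines for the same employee on the same date overlap when
      // one starts before the other ends AND ends after the other starts.
      function overlaps(array $a, array $b): bool
      {
          return $a['employee'] === $b['employee']
              && $a['date'] === $b['date']
              && $a['start'] < $b['end']
              && $a['end'] > $b['start'];
      }

      $approved = ['employee' => 'John', 'date' => '2013-07-01', 'start' => '04:00', 'end' => '08:00'];
      $pending  = ['employee' => 'John', 'date' => '2013-07-01', 'start' => '06:00', 'end' => '09:00'];

      if (overlaps($approved, $pending)) {
          // Conflict: clear the location (i.e. do not approve this line) and move on.
          echo 'Overlapping overtime already approved - location cleared, moving to next record.';
      }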

    Read the article

  • how to pass instance variables between handlers (routes) in sinatra (without flash, sessions, class variable or db)?

    - by jj_
    Say you have: get '/' do haml :index end get '/form' do haml :form end post '/form' do @message = params[:message] redirect to ('/') --- how to pass @message here? end I'd like the @message instance variable to be available (passed along) in the "/" action as well, so I can show it in the haml view. How can I do that without using a session, flash, a @@class_variable, or db persistence? I'd simply like to pass values as if I were passing values between methods. I don't want to use session cookies because the user could have them turned off, I don't like it being a class variable which is exposed to all code, and I don't need the overhead of a db. Thanks edit: This is another question explaining a very easy way to deal with this in Rails: Passing parameters in rails redirect_to This is some more info I gathered from forums. The following works for Rails; I've tried it in Sinatra with no luck, but please try it, maybe I did something wrong, I don't know, and if this code helps someone come up with a new idea, please share it. If you are redirecting to action2 at the end of action1, just append the value to the end of the redirect: my_var = <some logic> redirect_to :action => 'action2', :my_var => my_var on the same thread another user proposes the following: def action1 redirect_to :action => 'action2', :value => params[:current_varaible] end def action2 puts params[:value].inspect end source: http://www.ruby-forum.com/topic/134953 Can something like this work in Sinatra? Thanks

    Read the article

  • Unwanted redirection after authentication

    - by jodaha
    Hello world! We have a form to submit ratings for a certain restaurant in our views/restaurants/show.html.erb. We only want logged-in users to create new ratings. We put before_filter :login_required, :only => [ :new, :create ] (but we also tried only ":create") on top of our RatingsController. If we click the submit button after typing in the rating details we are prompted to log in (which is what we want). After filling in username and password and submitting the login form we get redirected back to e.g. /restaurants/36/ratings, but we want to be redirected back to where we came from, e.g. /restaurants/36/. We tried redirect_to(:back), but this redirects us back to the login form. Also, the new rating does not get saved to the database. Any idea how we can change the redirection and how to make sure the rating gets saved? Thanks!

    Read the article

  • asp.net mvc postback

    - by user266909
    I have a controller with the following two Edit methods. The edit form displays correctly, with all the additional dropdown lists from the FormViewModel. However, when I changed some field values and submitted the form, none of the changed fields were saved; the fields in the postback collection have default or null values. I have another edit form which updates another table, and on submit its changed values are saved. Does anyone know why? // GET: /Transfers/Edit/5 public ActionResult Edit(int id) { Transfer transfer = myRepository.GetTransfer(id); if (transfer == null) return View("NotFound"); return View(new TransferFormViewModel(transfer)); } // // POST: /Transfers/Edit/5 [AcceptVerbs(HttpVerbs.Post)] public ActionResult Edit(int id, Transfer collection) { Transfer transfer = vetsRepository.GetTransfer(id); if (transfer == null) return View("NotFound"); else { try { UpdateModel(transfer); vetsRepository.Save(); return RedirectToAction("Details", new { id = transfer.TransfersID }); } catch { ModelState.AddModelErrors(transfer.GetRuleViolations()); return View(new TransferFormViewModel(transfer)); } } }

    Read the article

  • Configure TFS portal afterwards

    Update #1 January 8th, 2010: There is an updated post on this topic for Beta 2: http://www.ewaldhofman.nl/post/2009/12/10/Configure-TFS-portal-afterwards-Beta-2.aspx Update #2 October 10th, 2010: In the new Team Foundation Server Power Tools September 2010, there is now a command to create a portal. tfpt addprojectportal   Add or move portal for an existing team project Usage: tfpt addprojectportal /collection:uri                              /teamproject:"project name"                              /processtemplate:"template name"                              [/webapplication:"webappname"]                              [/relativepath:"pathfromwebapp"]                              [/validate]                              [/verbose] /collection Required. URL of Team Project Collection. /teamproject Required. Specifies the name of the team project. /processtemplate Required. Specifies that name of the process template. /webapplication The name of the SharePoint Web Application. Must also specify relativepath. /relativepath The path for the site relative to the root URL for the SharePoint Web Application. Must also specify webapplication. /validate Specifies that the user inputs are to be validated. If specified, only validation will be done and no portal setting will be changed. /verbose Switches on the verbose mode. I created a new Team Project in TFS 2010 Beta 1 and choose not to configure SharePoint during the creation of the Team Project. Of course I found out fairly quickly that a portal for TFS is very useful, especially the Iteration and the Product backlog workbooks and the dashboard reports. This blog describes how you can configure the sharepoint portal afterwards. Update: September 9th, 2009 Adding the portal afterwards is much easier as described below. Here are the steps Step 1: Create a new temporary project (with a SharePoint site for it). Open the Team Explorer Right click in the Team Explorer the root node (i.e. the project collection) Select "New team project" from the menu Walk throught he wizard and make sure you check the option to create the portal (which is by default checked) Step 2: Disable the site for the new project Open the Team Explorer Select the team project you created in step 1 In the menu click on Team -> Show Project Portal. In the menu click on Team -> Team Project Settings -> Portal Settings... The following dialog pops up Uncheck the option "Enable team project portal" Confirm the dialog with OK Step 3: Enable the site for the original one. Point it to the newly created site. Open the Team Explorer Select the team project you want to add the portal to In the menu open Team -> Team Project Settings -> Portal Settings... The same dialog as in step 2 pops up Check the option "Enable team project portal" Click on the "Configure URL" button The following dialog pops up   In the dialog select in the combobox of the web application the TFS server Enter in the Relative site path the text "sites/[Project Collection Name]/[Team Project Name created in step 1]" Confirm the "Specify an existing SharePoint Site" with OK Check the "Reports and dashboards refer to data for this team project" option Confirm the dialog "Project Portal Settings" with OK Step 4: Delete the temporary project you created. In Beta 1, I have found no way to delete a team project. Maybe it will be available in TFS 2010 Beta 2. 
Original post Step 1: Create new portal site Go to the sharepoint site of your project collection (/sites//default.aspx">/sites//default.aspx">http://<servername>/sites/<project_collection_name>/default.aspx) Click on the Site Actions at the left side of the screen and choose the option Site Settings In the site settings, choose the Sites and workspaces option Create a new site Enter the values for the Title, the description, the site address. And choose for the TFS2010 Agile Dashboard as template. Create the site, by clicking on the Create button Step 2: Integrate portal site with team project Open Visual Studio Open the Team Explorer (View -> Team Explorer) Select in the Team Explorer tool window the Team Project for which you are create a new portal Open the Project Portal Settings (Team -> Team Project Settings -> Portal Setings...) Check the Enable team project portal checkbox Click on Configure URL... You will get a new dialog as below Enter the url to the TFS server in the web application combobox And specify the relative site path: sites/<project collection>/<site name> Confirm with OK Check in the Project Portal Settings dialog the checkbox "Reports and dashboards refer to data for this team project" Confirm the settings with OK (this takes a while...) When you now browse to the portal, you will see that the dashboards are now showing up with the data for the current team project. Step 3: Download process template To get a copy of the documents that are default in a team project, we need to have a fresh set of files that are not attached to a team project yet. You can do that with the following steps. Start the Process Template Manager (Team -> Team Project Collection Settings -> Process Template Manager...) Choose the Agile process template and click on download Choose a folder to download Step 4: Add Product and Iteration backlog Go to the Team Explorer in Visual Studio Make sure the team project is in the list of team projects, and expand the team project Right click the Documents node, and choose New Document Library Enter "Shared Documents", and click on Add Right click the Shared Documents node and choose Upload Document Go the the file location where you stored the process template from step 3 and then navigate to the subdirectory "Agile Process Template 5.0\MSF for Agile Software Development v5.0\Windows SharePoint Services\Shared Documents\Project Management" Select in the Open Dialog the files "Iteration Backlog" and "Product Backlog", and click Open Step 5: Bind Iteration backlog workbook to the team project Right click on the "Iteration Backlog" file and select Edit, and confirm any warning messages Place your cursor in cell A1 of the Iteration backlog worksheet Switch to the Team ribbon and click New List. Select your Team Project and click Connect From the New List dialog, select the Iteration Backlog query in the Workbook Queries folder. The final step is to add a set of document properties that allow the workbook to communicate with the TFS reporting warehouse. Before we create the properties we need to collect some information about your project. The first piece of information comes from the table created in the previous step.  As you collect these properties, copy them into notepad so they can be used in later steps. Property How to retrieve the value? 
[Table name] Switch to the Design ribbon and select the Table Name value in the Properties portion of the ribbon [Project GUID] In the Visual Studio Team Explorer, right click your Team Project and select Properties.  Select the URL value and copy the GUID (long value with lots of characters) at the end of the URL [Team Project name] In the Properties dialog, select the Name field and copy the value [TFS server name] In the Properties dialog, select the Server Name field and copy the value [UPDATE] I have found that this is not correct: you need to specify the instance of your SQL Server. The value is used to create a connection to the TFS cube. Switch back to the Iteration Backlog workbook. Click the Office button and select Prepare – Properties. Click the Document Properties – Server drop down and select Advanced Properties. Switch to the Custom tab and add the following properties using the values you collected above. Variable name Value [Table name]_ASServerName [TFS server name] [Table name]_ASDatabase tfs_warehouse [Table name]_TeamProjectName [Team Project name] [Table name]_TeamProjectId [Project GUID] Click OK to close the properties dialog. It is possible that the Estimated Work (Hours) is showing the #REF! value. To resolve that change the formula with: =SUMIFS([Table name][Original Estimate]; [Table name][Iteration Path];CurrentIteration&"*";[Table name][Area Path];AreaPath&"*";[Table name][Work Item Type]; "Task") For example =SUMIFS(VSTS_ab392b55_6647_439a_bae4_8c66e908bc0d[Original Estimate]; VSTS_ab392b55_6647_439a_bae4_8c66e908bc0d[Iteration Path];CurrentIteration&"*";VSTS_ab392b55_6647_439a_bae4_8c66e908bc0d[Area Path];AreaPath&"*";VSTS_ab392b55_6647_439a_bae4_8c66e908bc0d[Work Item Type]; "Task") Also the Total Remaining Work in the Individual Capacity table may contain #REF! values. To resolve that change the formula with: =SUMIFS([Table name][Remaining Work]; [Table name][Iteration Path];CurrentIteration&"*";[Table name][Area Path];AreaPath&"*";[Table name][Assigned To];[Team Member];[Table name][Work Item Type]; "Task") For example =SUMIFS(VSTS_ab392b55_6647_439a_bae4_8c66e908bc0d[Remaining Work]; VSTS_ab392b55_6647_439a_bae4_8c66e908bc0d[Iteration Path];CurrentIteration&"*";VSTS_ab392b55_6647_439a_bae4_8c66e908bc0d[Area Path];AreaPath&"*";VSTS_ab392b55_6647_439a_bae4_8c66e908bc0d[Assigned To];[Team Member];VSTS_ab392b55_6647_439a_bae4_8c66e908bc0d[Work Item Type]; "Task") Save and close the workbook. Step 6: Bind Product backlog workbook to the team project Repeat the steps for binding the Iteration backlog for thiw workbook too. In the worksheet Capacity, the formula of the Storypoints might be missing. You can resolve it with: =IF([Iteration]="";"";SUMIFS([Table name][Story Points];[Table name][Iteration Path];[Iteration]&"*")) Example =IF([Iteration]="";"";SUMIFS(VSTS_487f1e4c_db30_4302_b5e8_bd80195bc2ec[Story Points];VSTS_487f1e4c_db30_4302_b5e8_bd80195bc2ec[Iteration Path];[Iteration]&"*"))

    Read the article

  • Running a simple integration scenario using the Oracle Big Data Connectors on Hadoop/HDFS cluster

    - by hamsun
    Between the elephant ( the tradional image of the Hadoop framework) and the Oracle Iron Man (Big Data..) an english setter could be seen as the link to the right data Data, Data, Data, we are living in a world where data technology based on popular applications , search engines, Webservers, rich sms messages, email clients, weather forecasts and so on, have a predominant role in our life. More and more technologies are used to analyze/track our behavior, try to detect patterns, to propose us "the best/right user experience" from the Google Ad services, to Telco companies or large consumer sites (like Amazon:) ). The more we use all these technologies, the more we generate data, and thus there is a need of huge data marts and specific hardware/software servers (as the Exadata servers) in order to treat/analyze/understand the trends and offer new services to the users. Some of these "data feeds" are raw, unstructured data, and cannot be processed effectively by normal SQL queries. Large scale distributed processing was an emerging infrastructure need and the solution seemed to be the "collocation of compute nodes with the data", which in turn leaded to MapReduce parallel patterns and the development of the Hadoop framework, which is based on MapReduce and a distributed file system (HDFS) that runs on larger clusters of rather inexpensive servers. Several Oracle products are using the distributed / aggregation pattern for data calculation ( Coherence, NoSql, times ten ) so once that you are familiar with one of these technologies, lets says with coherence aggregators, you will find the whole Hadoop, MapReduce concept very similar. Oracle Big Data Appliance is based on the Cloudera Distribution (CDH), and the Oracle Big Data Connectors can be plugged on a Hadoop cluster running the CDH distribution or equivalent Hadoop clusters. In this paper, a "lab like" implementation of this concept is done on a single Linux X64 server, running an Oracle Database 11g Enterprise Edition Release 11.2.0.4.0, and a single node Apache hadoop-1.2.1 HDFS cluster, using the SQL connector for HDFS. The whole setup is fairly simple: Install on a Linux x64 server ( or virtual box appliance) an Oracle Database 11g Enterprise Edition Release 11.2.0.4.0 server Get the Apache Hadoop distribution from: http://mir2.ovh.net/ftp.apache.org/dist/hadoop/common/hadoop-1.2.1. Get the Oracle Big Data Connectors from: http://www.oracle.com/technetwork/bdc/big-data-connectors/downloads/index.html?ssSourceSiteId=ocomen. Check the java version of your Linux server with the command: java -version java version "1.7.0_40" Java(TM) SE Runtime Environment (build 1.7.0_40-b43) Java HotSpot(TM) 64-Bit Server VM (build 24.0-b56, mixed mode) Decompress the hadoop hadoop-1.2.1.tar.gz file to /u01/hadoop-1.2.1 Modify your .bash_profile export HADOOP_HOME=/u01/hadoop-1.2.1 export PATH=$PATH:$HADOOP_HOME/bin export HIVE_HOME=/u01/hive-0.11.0 export PATH=$PATH:$HADOOP_HOME/bin:$HIVE_HOME/bin (also see my sample .bash_profile) Set up ssh trust for Hadoop process, this is a mandatory step, in our case we have to establish a "local trust" as will are using a single node configuration copy the new public keys to the list of authorized keys connect and test the ssh setup to your localhost: We will run a "pseudo-Hadoop cluster", in what is called "local standalone mode", all the Hadoop java components are running in one Java process, this is enough for our demo purposes. 
We need to "fine tune" some Hadoop configuration files, we have to go at our $HADOOP_HOME/conf, and modify the files: core-site.xml hdfs-site.xml mapred-site.xml check that the hadoop binaries are referenced correctly from the command line by executing: hadoop -version As Hadoop is managing our "clustered HDFS" file system we have to create "the mount point" and format it , the mount point will be declared to core-site.xml as: The layout under the /u01/hadoop-1.2.1/data will be created and used by other hadoop components (MapReduce = /mapred/...) HDFS is using the /dfs/... layout structure format the HDFS hadoop file system: Start the java components for the HDFS system As an additional check, you can use the GUI Hadoop browsers to check the content of your HDFS configurations: Once our HDFS Hadoop setup is done you can use the HDFS file system to store data ( big data : )), and plug them back and forth to Oracle Databases by the means of the Big Data Connectors ( which is the next configuration step). You can create / use a Hive db, but in our case we will make a simple integration of "raw data" , through the creation of an External Table to a local Oracle instance ( on the same Linux box, we run the Hadoop HDFS one node cluster and one Oracle DB). Download some public "big data", I use the site: http://france.meteofrance.com/france/observations, from where I can get *.csv files for my big data simulations :). Here is the data layout of my example file: Download the Big Data Connector from the OTN (oraosch-2.2.0.zip), unzip it to your local file system (see picture below) Modify your environment in order to access the connector libraries , and make the following test: [oracle@dg1 bin]$./hdfs_stream Usage: hdfs_stream locationFile [oracle@dg1 bin]$ Load the data to the Hadoop hdfs file system: hadoop fs -mkdir bgtest_data hadoop fs -put obsFrance.txt bgtest_data/obsFrance.txt hadoop fs -ls /user/oracle/bgtest_data/obsFrance.txt [oracle@dg1 bg-data-raw]$ hadoop fs -ls /user/oracle/bgtest_data/obsFrance.txt Found 1 items -rw-r--r-- 1 oracle supergroup 54103 2013-10-22 06:10 /user/oracle/bgtest_data/obsFrance.txt [oracle@dg1 bg-data-raw]$hadoop fs -ls hdfs:///user/oracle/bgtest_data/obsFrance.txt Found 1 items -rw-r--r-- 1 oracle supergroup 54103 2013-10-22 06:10 /user/oracle/bgtest_data/obsFrance.txt Check the content of the HDFS with the browser UI: Start the Oracle database, and run the following script in order to create the Oracle database user, the Oracle directories for the Oracle Big Data Connector (dg1 it’s my own db id replace accordingly yours): #!/bin/bash export ORAENV_ASK=NO export ORACLE_SID=dg1 . 
oraenv sqlplus /nolog <<EOF CONNECT / AS sysdba; CREATE OR REPLACE DIRECTORY osch_bin_path AS '/u01/orahdfs-2.2.0/bin'; CREATE USER BGUSER IDENTIFIED BY oracle; GRANT CREATE SESSION, CREATE TABLE TO BGUSER; GRANT EXECUTE ON sys.utl_file TO BGUSER; GRANT READ, EXECUTE ON DIRECTORY osch_bin_path TO BGUSER; CREATE OR REPLACE DIRECTORY BGT_LOG_DIR as '/u01/BG_TEST/logs'; GRANT READ, WRITE ON DIRECTORY BGT_LOG_DIR to BGUSER; CREATE OR REPLACE DIRECTORY BGT_DATA_DIR as '/u01/BG_TEST/data'; GRANT READ, WRITE ON DIRECTORY BGT_DATA_DIR to BGUSER; EOF Put the following in a file named t3.sh and make it executable, hadoop jar $OSCH_HOME/jlib/orahdfs.jar \ oracle.hadoop.exttab.ExternalTable \ -D oracle.hadoop.exttab.tableName=BGTEST_DP_XTAB \ -D oracle.hadoop.exttab.defaultDirectory=BGT_DATA_DIR \ -D oracle.hadoop.exttab.dataPaths="hdfs:///user/oracle/bgtest_data/obsFrance.txt" \ -D oracle.hadoop.exttab.columnCount=7 \ -D oracle.hadoop.connection.url=jdbc:oracle:thin:@//localhost:1521/dg1 \ -D oracle.hadoop.connection.user=BGUSER \ -D oracle.hadoop.exttab.printStackTrace=true \ -createTable --noexecute then test the creation fo the external table with it: [oracle@dg1 samples]$ ./t3.sh ./t3.sh: line 2: /u01/orahdfs-2.2.0: Is a directory Oracle SQL Connector for HDFS Release 2.2.0 - Production Copyright (c) 2011, 2013, Oracle and/or its affiliates. All rights reserved. Enter Database Password:] The create table command was not executed. The following table would be created. CREATE TABLE "BGUSER"."BGTEST_DP_XTAB" ( "C1" VARCHAR2(4000), "C2" VARCHAR2(4000), "C3" VARCHAR2(4000), "C4" VARCHAR2(4000), "C5" VARCHAR2(4000), "C6" VARCHAR2(4000), "C7" VARCHAR2(4000) ) ORGANIZATION EXTERNAL ( TYPE ORACLE_LOADER DEFAULT DIRECTORY "BGT_DATA_DIR" ACCESS PARAMETERS ( RECORDS DELIMITED BY 0X'0A' CHARACTERSET AL32UTF8 STRING SIZES ARE IN CHARACTERS PREPROCESSOR "OSCH_BIN_PATH":'hdfs_stream' FIELDS TERMINATED BY 0X'2C' MISSING FIELD VALUES ARE NULL ( "C1" CHAR(4000), "C2" CHAR(4000), "C3" CHAR(4000), "C4" CHAR(4000), "C5" CHAR(4000), "C6" CHAR(4000), "C7" CHAR(4000) ) ) LOCATION ( 'osch-20131022081035-74-1' ) ) PARALLEL REJECT LIMIT UNLIMITED; The following location files would be created. osch-20131022081035-74-1 contains 1 URI, 54103 bytes 54103 hdfs://localhost:19000/user/oracle/bgtest_data/obsFrance.txt Then remove the --noexecute flag and create the external Oracle table for the Hadoop data. Check the results: The create table command succeeded. CREATE TABLE "BGUSER"."BGTEST_DP_XTAB" ( "C1" VARCHAR2(4000), "C2" VARCHAR2(4000), "C3" VARCHAR2(4000), "C4" VARCHAR2(4000), "C5" VARCHAR2(4000), "C6" VARCHAR2(4000), "C7" VARCHAR2(4000) ) ORGANIZATION EXTERNAL ( TYPE ORACLE_LOADER DEFAULT DIRECTORY "BGT_DATA_DIR" ACCESS PARAMETERS ( RECORDS DELIMITED BY 0X'0A' CHARACTERSET AL32UTF8 STRING SIZES ARE IN CHARACTERS PREPROCESSOR "OSCH_BIN_PATH":'hdfs_stream' FIELDS TERMINATED BY 0X'2C' MISSING FIELD VALUES ARE NULL ( "C1" CHAR(4000), "C2" CHAR(4000), "C3" CHAR(4000), "C4" CHAR(4000), "C5" CHAR(4000), "C6" CHAR(4000), "C7" CHAR(4000) ) ) LOCATION ( 'osch-20131022081719-3239-1' ) ) PARALLEL REJECT LIMIT UNLIMITED; The following location files were created. 
osch-20131022081719-3239-1 contains 1 URI, 54103 bytes 54103 hdfs://localhost:19000/user/oracle/bgtest_data/obsFrance.txt This is the view from the SQL Developer: and finally the number of lines in the oracle table, imported from our Hadoop HDFS cluster SQL select count(*) from "BGUSER"."BGTEST_DP_XTAB"; COUNT(*) ---------- 1151 In a next post we will integrate data from a Hive database, and try some ODI integrations with the ODI Big Data connector. Our simplistic approach is just a step to show you how these unstructured data world can be integrated to Oracle infrastructure. Hadoop, BigData, NoSql are great technologies, they are widely used and Oracle is offering a large integration infrastructure based on these services. Oracle University presents a complete curriculum on all the Oracle related technologies: NoSQL: Introduction to Oracle NoSQL Database Using Oracle NoSQL Database Big Data: Introduction to Big Data Oracle Big Data Essentials Oracle Big Data Overview Oracle Data Integrator: Oracle Data Integrator 12c: New Features Oracle Data Integrator 11g: Integration and Administration Oracle Data Integrator: Administration and Development Oracle Data Integrator 11g: Advanced Integration and Development Oracle Coherence 12c: Oracle Coherence 12c: New Features Oracle Coherence 12c: Share and Manage Data in Clusters Oracle Coherence 12c: Oracle GoldenGate 11g: Fundamentals for Oracle Oracle GoldenGate 11g: Fundamentals for SQL Server Oracle GoldenGate 11g Fundamentals for Oracle Oracle GoldenGate 11g Fundamentals for DB2 Oracle GoldenGate 11g Fundamentals for Teradata Oracle GoldenGate 11g Fundamentals for HP NonStop Oracle GoldenGate 11g Management Pack: Overview Oracle GoldenGate 11g Troubleshooting and Tuning Oracle GoldenGate 11g: Advanced Configuration for Oracle Other Resources: Apache Hadoop : http://hadoop.apache.org/ is the homepage for these technologies. "Hadoop Definitive Guide 3rdEdition" by Tom White is a classical lecture for people who want to know more about Hadoop , and some active "googling " will also give you some more references. About the author: Eugene Simos is based in France and joined Oracle through the BEA-Weblogic Acquisition, where he worked for the Professional Service, Support, end Education for major accounts across the EMEA Region. He worked in the banking sector, ATT, Telco companies giving him extensive experience on production environments. Eugen currently specializes in Oracle Fusion Middleware teaching an array of courses on Weblogic/Webcenter, Content,BPM /SOA/Identity-Security/GoldenGate/Virtualisation/Unified Comm Suite) throughout the EMEA region.

    Read the article

  • Windows Azure Mobile Services: New support for iOS apps, Facebook/Twitter/Google identity, Emails, SMS, Blobs, Service Bus and more

    - by ScottGu
    A few weeks ago I blogged about Windows Azure Mobile Services - a new capability in Windows Azure that makes it incredibly easy to connect your client and mobile applications to a scalable cloud backend. Earlier today we delivered a number of great improvements to Windows Azure Mobile Services.  New features include: iOS support – enabling you to connect iPhone and iPad apps to Mobile Services Facebook, Twitter, and Google authentication support with Mobile Services Blob, Table, Queue, and Service Bus support from within your Mobile Service Sending emails from your Mobile Service (in partnership with SendGrid) Sending SMS messages from your Mobile Service (in partnership with Twilio) Ability to deploy mobile services in the West US region All of these improvements are now live in production and available to start using immediately. Below are more details on them: iOS Support This week we delivered initial support for connecting iOS based devices (including iPhones and iPads) to Windows Azure Mobile Services.  Like the rest of our Windows Azure SDK, we are delivering the native iOS libraries to enable this under an open source (Apache 2.0) license on GitHub.  We’re excited to get your feedback on this new library through our forum and GitHub issues list, and we welcome contributions to the SDK. To create a new iOS app or connect an existing iOS app to your Mobile Service, simply select the “iOS” tab within the Quick Start view of a Mobile Service within the Windows Azure Portal – and then follow either the “Create a new iOS app” or “Connect to an existing iOS app” link below it: Clicking either of these links will expand and display step-by-step instructions for how to build an iOS application that connects with your Mobile Service: Read this getting started tutorial to walkthrough how you can build (in less than 5 minutes) a simple iOS “Todo List” app that stores data in Windows Azure.  Then follow the below tutorials to explore how to use the iOS client libraries to store data and authenticate users. Get Started with data in Mobile Services for iOS Get Started with authentication in Mobile Services for iOS Facebook, Twitter, and Google Authentication Support Our initial preview of Mobile Services supported the ability to authenticate users of mobile apps using Microsoft Accounts (formerly called Windows Live ID accounts).  This week we are adding the ability to also authenticate users using Facebook, Twitter, and Google credentials.  These are now supported with both Windows 8 apps as well as iOS apps (and a single app can support multiple forms of identity simultaneously – so you can offer your users a choice of how to login). The below tutorials walkthrough how to register your Mobile Service with an identity provider: How to register your app with Microsoft Account How to register your app with Facebook How to register your app with Twitter How to register your app with Google The tutorials above walkthrough how to obtain a client ID and a secret key from the identity provider. You can then click on the “Identity” tab of your Mobile Service (within the Windows Azure Portal) and save these values to enable server-side authentication with your Mobile Service: You can then write code within your client or mobile app to authenticate your users to the Mobile Service.  
For example, below is the code you would write to have them login to the Mobile Service using their Facebook credentials: Windows Store App (using C#): var user = await App.MobileService                     .LoginAsync(MobileServiceAuthenticationProvider.Facebook); iOS app (using Objective C): UINavigationController *controller = [self.todoService.client     loginViewControllerWithProvider:@"facebook"     completion:^(MSUser *user, NSError *error) {        //... }]; Learn more about authenticating Mobile Services using Microsoft Account, Facebook, Twitter, and Google from these tutorials: Get started with authentication in Mobile Services for Windows Store (C#) Get started with authentication in Mobile Services for Windows Store (JavaScript) Get started with authentication in Mobile Services for iOS Using Windows Azure Blob, Tables and ServiceBus with your Mobile Services Mobile Services provide a simple but powerful way to add server logic using server scripts. These scripts are associated with the individual CRUD operations on your mobile service’s tables. Server scripts are great for data validation, custom authorization logic (e.g. does this user participate in this game session), augmenting CRUD operations, sending push notifications, and other similar scenarios.   Server scripts are written in JavaScript and are executed in a secure server-side scripting environment built using Node.js.  You can edit these scripts and save them on the server directly within the Windows Azure Portal: In this week’s release we have added the ability to work with other Windows Azure services from your Mobile Service server scripts.  This is supported using the existing “azure” module within the Windows Azure SDK for Node.js.  For example, the below code could be used in a Mobile Service script to obtain a reference to a Windows Azure Table (after which you could query it or insert data into it):     var azure = require('azure');     var tableService = azure.createTableService("<< account name >>",                                                 "<< access key >>"); Follow the tutorials on the Windows Azure Node.js dev center to learn more about working with Blob, Tables, Queues and Service Bus using the azure module. Sending emails from your Mobile Service In this week’s release we have also added the ability to easily send emails from your Mobile Service, building on our partnership with SendGrid. Whether you want to add a welcome email upon successful user registration, or make your app alert you of certain usage activities, you can do this now by sending email from Mobile Services server scripts. To get started, sign up for SendGrid account at http://sendgrid.com . Windows Azure customers receive a special offer of 25,000 free emails per month from SendGrid. To sign-up for this offer, or get more information, please visit http://www.sendgrid.com/azure.html . One you signed up, you can add the following script to your Mobile Service server scripts to send email via SendGrid service:     var sendgrid = new SendGrid('<< account name >>', '<< password >>');       sendgrid.send({         to: '<< enter email address here >>',         from: '<< enter from address here >>',         subject: 'New to-do item',         text: 'A new to-do was added: ' + item.text     }, function (success, message) {         if (!success) {             console.error(message);         }     }); Follow the Send email from Mobile Services with SendGrid tutorial to learn more. 
Sending SMS messages from your Mobile Service SMS is a key communication medium for mobile apps - it comes in handy if you want your app to send users a confirmation code during registration, allow your users to invite their friends to install your app or reach out to mobile users without a smartphone. Using Mobile Service server scripts and Twilio’s REST API, you can now easily send SMS messages to your app.  To get started, sign up for Twilio account. Windows Azure customers receive 1000 free text messages when using Twilio and Windows Azure together. Once signed up, you can add the following to your Mobile Service server scripts to send SMS messages:     var httpRequest = require('request');     var account_sid = "<< account SID >>";     var auth_token = "<< auth token >>";       // Create the request body     var body = "From=" + from + "&To=" + to + "&Body=" + message;       // Make the HTTP request to Twilio     httpRequest.post({         url: "https://" + account_sid + ":" + auth_token +              "@api.twilio.com/2010-04-01/Accounts/" + account_sid + "/SMS/Messages.json",         headers: { 'content-type': 'application/x-www-form-urlencoded' },         body: body     }, function (err, resp, body) {         console.log(body);     }); I’m excited to be speaking at the TwilioCon conference this week, and will be showcasing some of the cool scenarios you can now enable with Twilio and Windows Azure Mobile Services. Mobile Services availability in West US region Our initial preview of Windows Azure Mobile Services was only supported in the US East region of Windows Azure.  As with every Windows Azure service, overtime we will extend Mobile Services to all Windows Azure regions. With this week’s preview update we’ve added support so that you can now create your Mobile Service in the West US region as well: Summary The above features are all now live in production and are available to use immediately.  If you don’t already have a Windows Azure account, you can sign-up for a free trial and start using Mobile Services today. Visit the Windows Azure Mobile Developer Center to learn more about how to build apps with Mobile Services. We’ll have even more new features and enhancements coming later this week – including .NET 4.5 support for Windows Azure Web Sites.  Keep an eye out on my blog for details as new features become available. Hope this helps, Scott P.S. In addition to blogging, I am also now using Twitter for quick updates and to share links. Follow me at: twitter.com/scottgu

    Read the article

  • Mouse Clicks, Reactive Extensions and StreamInsight Mashup

    I had an hour spare this afternoon so I wanted to have another play with Reactive Extensions in .Net and StreamInsight.  I also didn’t want to simply use a console window as a way of gathering events so I decided to use a windows form instead. The task I set myself was this. Whenever I click on my form I want to subscribe to the event and output its location to the console window and also the timestamp of the event.  In addition to this I want to know for every mouse click I do, how many mouse clicks have happened in the last 5 seconds. The second point here is really interesting.  I have often found this when working with people on problems.  It is how you ask the question that determines how you tackle the problem.  I will show 2 ways of possibly answering the second question depending on how the question was interpreted. As a side effect of this example I will show how time in StreamInsight can stand still.  This is an important concept and we can see it in the output later. Now to the code.  I will break it all down in this blogpost but you can download the solution and see it all together. I created a Console application and then instantiate a windows form.   frm = new Form(); Thread g = new Thread(CallUI); g.SetApartmentState(ApartmentState.STA); g.Start();   Call UI looks like this   static void CallUI() { System.Windows.Forms.Application.Run(frm); frm.Activate(); frm.BringToFront(); }   Now what we need to do is create an observable from the MouseClick event on the form.  For this we use the Reactive Extensions.   var lblevt = Observable.FromEvent<MouseEventArgs>(frm, "MouseClick").Timestamp();   As mentioned earlier I have two objectives in this example and to solve the first I am going to again use the Reactive extensions.  Let’s subscribe to the MouseClick event and output the location and timestamp to the console. lblevt.Subscribe(evt => { Console.WriteLine("Clicked: {0}, {1} ", evt.Value.EventArgs.Location,evt.Timestamp); }); That should take care of obective #1 but what about the second objective.  For that we need some temporal windowing and this means StreamInsight.  First we need to turn our Observable collection of MouseClick events into a PointStream Server s = Server.Create("Default"); Microsoft.ComplexEventProcessing.Application a = s.CreateApplication("MouseClicks"); var input = lblevt.ToPointStream( a, evt => PointEvent.CreateInsert( evt.Timestamp, new { loc = evt.Value.EventArgs.Location.ToString(), ts = evt.Timestamp.ToLocalTime().ToString() }), AdvanceTimeSettings.IncreasingStartTime);   Now that we have created out PointStream we need to do something with it and this is where we get to our second objective.  It is pretty clear that we want some kind of windowing but what? Here is one way of doing it.  It might not be what you wanted but again it is how the second objective is interpreted   var q = from i in input.TumblingWindow(TimeSpan.FromSeconds(5), HoppingWindowOutputPolicy.ClipToWindowEnd) select new { CountOfClicks = i.Count() };   The above code creates tumbling windows of 5 seconds and counts the number of events in the windows.  If there are no events in the window then no result is output.  Likewise until an event (MouseClick) is issued then we do not see anything in the output (that is not strictly true because it is the CTI strapped to our MouseClick events that flush the events through the StreamInsight engine not the events themselves).  This approach is centred around the windows and not the events.  
Until the windows complete and a CTI is issued then no events are pushed through. An alternate way of answering our second question is below   var q = from i in input.AlterEventDuration(evt => TimeSpan.FromSeconds(5)).SnapshotWindow(SnapshotWindowOutputPolicy.Clip) select new { CountOfClicks = i.Count() };   In this code we extend the duration of each MouseClick to five seconds.  We then create  Snapshot Windows over those events.  Snapshot windows are discussed in detail here.  With this solution we are centred around the events.  It is the events that are driving the output.  Let’s have a look at the output from this solution as it may be a little confusing. First though let me show how we get the output from StreamInsight into the Console window. foreach (var x in q.ToPointEnumerable().Where(e => e.EventKind != EventKind.Cti)) { Console.WriteLine(x.Payload.CountOfClicks); }   Ok so now to the output.   The table at the top shows the output from our routine and the table at the bottom helps to explain the output.  One of the things that will help as well is, you will note that for our PointStream we set the issuing of CTIs to be IncreasingStartTime.  What this means is that the CTI is placed right at the start of the event so will not flush the event with which it was issued but will flush those prior to it.  In the bottom table the Blue fill is where we issued a click.  Yellow fill is the duration and boundaries of our events.  The numbers at the bottom indicate the count of events   Clicked 22:40:16                                 Clicked 23:40:18                                 1                                   Clicked 23:40:20                                 2                                   Clicked 23:40:22                                 3                                   2                                   Clicked 23:40:24                                 3                                   2                                   Clicked 23:40:32                                 3                                   2                                   1                                                                                                         secs 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30 31 32                                                                                                                                                                                                                         counts   1   2 3 2 3 2 3   2   1           What we can see here in the output is that the counts include all the end edges that have occurred between the mouse clicks.  If we look specifically at the mouse click at 22:40:32. then we see that 3 events are returned to us. These include the following End Edge count at 22:40:25 End Edge count at 22:40:27 End Edge count at 22:40:29 Another thing we notice is that until we actually issue a CTI at 22:40:32 then those last 3 snapshot window counts will never be reported. Hopefully this has helped to explain  a few concepts around StreamInsight and the IObservable() pattern.   You can download this solution from here and play.  You will need the Reactive Framework from here and StreamInsight 1.1

    Read the article

  • What would you do if you just had this code dumped in your lap?

    - by chickeninabiscuit
    Man, I just had this project given to me - expand on this they say. This is an example of ONE function: <?php //500+ lines of pure wonder. function page_content_vc($content) { global $_DBH, $_TPL, $_SET; $_SET['ignoreTimezone'] = true; lu_CheckUpdateLogin(); if($_SESSION['dash']['VC']['switch'] == 'unmanned' || $_SESSION['dash']['VC']['switch'] == 'touchscreen') { if($content['page_name'] != 'vc') { header('Location: /vc/'); die(); } } if($_GET['l']) { unset($_SESSION['dash']['VC']); if($loc_id = lu_GetFieldValue('ID', 'Location', $_GET['l'])) { if(lu_CheckPermissions('vc', $loc_id)) { $timezone = lu_GetFieldValue('Time Zone', 'Location', $loc_id, 'ID'); if(strlen($timezone) > 0) { $_SESSION['time_zone'] = $timezone; } $_SESSION['dash']['VC']['loc_ID'] = $loc_id; header('Location: /vc/'); die(); } } } if($_SESSION['dash']['VC']['loc_ID']) { $timezone = lu_GetFieldValue('Time Zone', 'Location', $_SESSION['dash']['VC']['loc_ID'], 'ID'); if(strlen($timezone) > 0) { $_SESSION['time_zone'] = $timezone; } $loc_id = $_SESSION['dash']['VC']['loc_ID']; $org_id = lu_GetFieldValue('record_ID', 'Location', $loc_id); $_TPL->assign('loc_id', $loc_id); $location_name = lu_GetFieldValue('Location Name', 'Location', $loc_id); $_TPL->assign('LocationName', $location_name); $customer_name = lu_GetFieldValue('Customer Name', 'Organisation', $org_id); $_TPL->assign('CustomerName', $customer_name); $enable_visitor_snap = lu_GetFieldValue('VisitorSnap', 'Location', $loc_id); $_TPL->assign('EnableVisitorSnap', $enable_visitor_snap); $lacps = explode("\n", lu_GetFieldValue('Location Access Control Point', 'Location', $loc_id)); array_walk($lacps, 'trim_value'); if(count($lacps) > 0) { if(count($lacps) == 1) { $_SESSION['dash']['VC']['lacp'] = $lacps[0]; } else { if($_GET['changeLACP'] && in_array($_GET['changeLACP'], $lacps)) { $_SESSION['dash']['VC']['lacp'] = $_GET['changeLACP']; header('Location: /vc/'); die(); } else if(!in_array($_SESSION['dash']['VC']['lacp'], $lacps)) { $_SESSION['dash']['VC']['lacp'] = $lacps[0]; } $_TPL->assign('LACP_array', $lacps); } $_TPL->assign('current_LACP', $_SESSION['dash']['VC']['lacp']); $_TPL->assign('showContractorSearch', true); /* if($contractorStaff = lu_GetTableRow('ContractorStaff', $org_id, 'record_ID', 'record_Inactive != "checked"')) { foreach($contractorStaff['rows'] as $contractor) { $lacp_rights = lu_OrganiseCustomDataFunctionMultiselect($contractor[lu_GetFieldName('Location Access Rights', 'ContractorStaff')]); if(in_array($_SESSION['dash']['VC']['lacp'], $lacp_rights)) { $_TPL->assign('showContractorSearch', true); } } } */ } $selectedOptions = explode(',', lu_GetFieldValue('Included Fields', 'Location', $_SESSION['dash']['VC']['loc_ID'])); $newOptions = array(); foreach($selectedOptions as $selOption) { $so_array = explode('|', $selOption, 2); if(count($so_array) > 1) { $newOptions[$so_array[0]] = $so_array[1]; } else { $newOptions[$so_array[0]] = "Both"; } } if($newOptions[lu_GetFieldName('Expected Length of Visit', 'Visitor')]) { $alert = false; if($visitors = lu_OrganiseVisitors( lu_GetTableRow('Visitor', 'checked', lu_GetFieldName('Checked In', 'Visitor'), lu_GetFieldName('Location for Visit', 'Visitor').'="'.$_SESSION['dash']['VC']['loc_ID'].'" AND '.lu_GetFieldName('Checked Out', 'Visitor').' 
!= "checked"'), false, true, true)) { foreach($visitors['rows'] as $key => $visitor) { if($visitor['expected'] && $visitor['expected'] + (60*30) < time()) { $alert = true; } } } if($alert == true) { $_TPL->assign('showAlert', 'red'); } else { //$_TPL->assign('showAlert', 'green'); } } $_TPL->assign('switch', $_SESSION['dash']['VC']['switch']); if($_SESSION['dash']['VC']['switch'] == 'touchscreen') { $_TPL->assign('VC_unmanned', true); } if($_GET['check'] == 'in') { if($_SESSION['dash']['VC']['switch'] == 'touchscreen') { lu_CheckInTouchScreen(); } else { lu_CheckIn(); } } else if($_GET['check'] == 'out') { if($_SESSION['dash']['VC']['switch'] == 'touchscreen') { lu_CheckOutTouchScreen(); } else { lu_CheckOut(); } } else if($_GET['switch'] == 'unmanned') { $_SESSION['dash']['VC']['switch'] = 'unmanned'; if($_GET['printing'] == true && (lu_GetFieldValue('Printing', 'Location', $_SESSION['dash']['VC']['loc_ID']) != "No" && lu_GetFieldValue('Printing', 'Location', $_SESSION['dash']['VC']['loc_ID']) != "")) { $_SESSION['dash']['VC']['printing'] = true; } else { $_SESSION['dash']['VC']['printing'] = false; } header('Location: /vc/'); die(); } else if($_GET['switch'] == 'touchscreen') { $_SESSION['dash']['VC']['switch'] = 'touchscreen'; if($_GET['printing'] == true && (lu_GetFieldValue('Printing', 'Location', $_SESSION['dash']['VC']['loc_ID']) != "No" && lu_GetFieldValue('Printing', 'Location', $_SESSION['dash']['VC']['loc_ID']) != "")) { $_SESSION['dash']['VC']['printing'] = true; } else { $_SESSION['dash']['VC']['printing'] = false; } header('Location: /vc/'); die(); } else if($_GET['switch'] == 'manned') { if($_POST['password']) { if(md5($_POST['password']) == $_SESSION['dash']['password']) { unset($_SESSION['dash']['VC']['switch']); //setcookie('email', "", time() - 3600); //setcookie('location', "", time() - 3600); header('Location: /vc/'); die(); } else { $_TPL->assign('switchLoginError', 'Incorrect Password'); } } $_TPL->assign('switchLogin', 'true'); } else if($_GET['m'] == 'visitor') { lu_ModifyVisitorVC(); } else if($_GET['m'] == 'enote') { lu_ModifyEnoteVC(); } else if($_GET['m'] == 'medical') { lu_ModifyMedicalVC(); } else if($_GET['print'] == 'label' && $_GET['v']) { lu_PrintLabelVC(); } else { unset($_SESSION['dash']['VC']['checkin']); unset($_SESSION['dash']['VC']['checkout']); $_TPL->assign('icon', 'GroupCheckin'); if($_SESSION['dash']['VC']['switch'] != 'unmanned' && $_SESSION['dash']['VC']['switch'] != 'touchscreen') { $staff_ids = array(); if($staffs = lu_GetTableRow('Staff', $_SESSION['dash']['VC']['loc_ID'], 'record_ID')) { foreach($staffs['rows'] as $staff) { $staff_ids[] = $staff['ID']; } } if($_GET['view'] == "tomorrow") { $dateStart = date('Y-m-d', mktime(0, 0, 0, date("m") , date("d")+1, date("Y"))); $dateEnd = date('Y-m-d', mktime(0, 0, 0, date("m") , date("d")+1, date("Y"))); } else if($_GET['view'] == "month") { $dateStart = date('Y-m-d', mktime(0, 0, 0, date("m"), date("d"), date("Y"))); $dateEnd = date('Y-m-d', mktime(0, 0, 0, date("m"), date("d")+30, date("Y"))); } else if($_GET['view'] == "week") { $dateStart = date('Y-m-d', mktime(0, 0, 0, date("m"), date("d"), date("Y"))); $dateEnd = date('Y-m-d', mktime(0, 0, 0, date("m"), date("d")+7, date("Y"))); } else { $dateStart = date('Y-m-d'); $dateEnd = date('Y-m-d'); } if(lu_GetFieldValue('Enable Survey', 'Location', $_SESSION['dash']['VC']['loc_ID']) == 'checked' && lu_GetFieldValue('Add Survey', 'Location', $_SESSION['dash']['VC']['loc_ID']) == 'checked') { $_TPL->assign('enableSurvey', true); } 
//lu_GetFieldName('Checked In', 'Visitor') //!= "checked" //date('d/m/Y'), lu_GetFieldName('Date of Visit', 'Visitor') if($visitors = lu_OrganiseVisitors(lu_GetTableRow('Visitor', $_SESSION['dash']['VC']['loc_ID'], lu_GetFieldName('Location for Visit', 'Visitor'), lu_GetFieldName('Checked In', 'Visitor').' != "checked" AND '.lu_GetFieldName('Checked Out', 'Visitor').' != "checked" AND '.lu_GetFieldName('Date of Visit', 'Visitor').' >= "'.$dateStart.'" AND '.lu_GetFieldName('Date of Visit', 'Visitor').' <= "'.$dateEnd.'"'))) { foreach($visitors['days'] as $day => $visitors_day) { foreach($visitors_day['rows'] as $key => $visitor) { $visitors['days'][$day]['rows'][$key]['visiting'] = lu_GetTableRow('Staff', $visitor['record_ID'], 'ID'); $visitors['days'][$day]['rows'][$key]['visiting']['notify'] = $_DBH->getRow('SELECT * FROM lu_notification WHERE ent_ID = "'.$visitor['record_ID'].'"'); } } //array_dump($visitors); $_TPL->assign('visitors', $visitors); } if($_GET['conGroup']) { if($_GET['action'] == 'add') { $_SESSION['dash']['VC']['conGroup'][$_GET['conGroup']] = $_GET['conGroup']; } else { unset($_SESSION['dash']['VC']['conGroup'][$_GET['conGroup']]); } } if(count($_SESSION['dash']['VC']['conGroup']) > 0) { if($conGroupResult = lu_GetTableRow('ContractorStaff', '1', '1', ' ID IN ('.implode(',', $_SESSION['dash']['VC']['conGroup']).')')) { if($_POST['_submit'] == 'Check-In Group >>') { $form = lu_GetForm('VisitorStandard'); $standarddata = array(); foreach($form['items'] as $key=>$item) { $standarddata[$key] = $_POST[lu_GetFieldName($item['name'], 'Visitor')]; } foreach($conGroupResult['rows'] as $conStaff) { $data = $standarddata; foreach($form['items'] as $key=>$item) { if($key != 'ID' && $key != 'record_ID' && $conStaff[lu_GetFieldName(lu_GetNameField($key, 'Visitor'), 'ContractorStaff')]) { $data[$key] = $conStaff[lu_GetFieldName(lu_GetNameField($key, 'Visitor'), 'ContractorStaff')]; } } $data['record_ID'] = $data[lu_GetFieldName('Visiting', 'Visitor')]; $data[lu_GetFieldName('Date of Visit', 'Visitor')] = date('Y-m-d'); $data[lu_GetFieldName('Time of Visit', 'Visitor')] = date('H:i'); $data[lu_GetFieldName('Checked In', 'Visitor')] = 'checked'; $data[lu_GetFieldName('Location for Visit', 'Visitor')] = $_SESSION['dash']['VC']['loc_ID']; $data[lu_GetFieldName('ConStaff ID', 'Visitor')] = $conStaff['ID']; $data[lu_GetFieldName('From', 'Visitor')] = lu_GetFieldValue('Legal Name', 'Contractor', $conStaff[lu_GetFieldName('Contractor', 'ContractorStaff')]); $id = lu_UpdateData($form, $data); lu_VisitorCheckIn($id); //array_dump($data); //array_dump($id); } unset($_SESSION['dash']['VC']['conGroup']); header('Location: /vc/'); die(); } if(count($conGroupResult['rows'])) { foreach($conGroupResult['rows'] as $key => $cstaff) { $conGroupResult['rows'][$key]['contractor'] = lu_GetTableRow('Contractor', $cstaff[lu_GetFieldName('Contractor', 'ContractorStaff')], 'ID'); } $_TPL->assign('conGroupResult', $conGroupResult); } $conGroupForm = lu_GetForm('VisitorConGroup'); $conGroupForm = lu_OrganiseVisitorForm($conGroupForm, $_SESSION['dash']['VC']['loc_ID'], 'Contractor'); $secure_options_array = lu_GetSecureOptions($org_id); if($secure_options_array[$_SESSION['dash']['VC']['loc_ID']]) { $conGroupForm['items'][lu_GetFieldName('Secure Area', 'Visitor')]['options']['values'] = $secure_options_array[$_SESSION['dash']['VC']['loc_ID']]; $conGroupForm['items'][lu_GetFieldName('Secure Area', 'Visitor')]['name'] = 'Secure Area'; } else { unset($conGroupForm['items'][lu_GetFieldName('Secure Area', 'Visitor')]); 
} if($secure_options_array) { $form['items'][lu_GetFieldName('Secure Area', 'Visitor')]['options']['values'] = $secure_options_array; $form['items'][lu_GetFieldName('Secure Area', 'Visitor')]['name'] = 'Secure Area'; } else { unset($form['items'][lu_GetFieldName('Secure Area', 'Visitor')]); } $_TPL->assign('conGroupForm', $conGroupForm); $_TPL->assign('hideFormCancel', true); } } if($_GET['searchVisitors']) { $_TPL->assign('searchVisitorsQuery', $_GET['searchVisitors']); $where = ''; if($_GET['searchVisitorsIn'] == 'Yes') { $where .= ' AND '.lu_GetFieldName('Checked In', 'Visitor').' = "checked"'; $_TPL->assign('searchVisitorsIn', 'Yes'); } else { $where .= ' AND '.lu_GetFieldName('Checked In', 'Visitor').' != "checked"'; $_TPL->assign('searchVisitorsIn', 'No'); } if($_GET['searchVisitorsOut'] == 'Yes') { $where = ''; $where .= ' AND '.lu_GetFieldName('Checked Out', 'Visitor').' = "checked"'; $_TPL->assign('searchVisitorsOut', 'Yes'); } else { $where .= ' AND '.lu_GetFieldName('Checked Out', 'Visitor').' != "checked"'; $_TPL->assign('searchVisitorsOut', 'No'); } if($searchVisitors = lu_OrganiseVisitors(lu_GetTableRow('Visitor', $_GET['searchVisitors'], '#search#', lu_GetFieldName('Location for Visit', 'Visitor').'="'.$_SESSION['dash']['VC']['loc_ID'].'"'.$where))) { foreach($searchVisitors['rows'] as $key => $visitor) { $searchVisitors['rows'][$key]['visiting'] = lu_GetTableRow('Staff', $visitor['record_ID'], 'ID'); } $_TPL->assign('searchVisitors', $searchVisitors); } else { $_TPL->assign('searchVisitorsNotFound', true); } } else if($_GET['searchStaff']) { if($_POST['staff_id']) { if(lu_CheckPermissions('staff', $_POST['staff_id'])) { $_DBH->query('UPDATE '.lu_GetTableName('Staff').' SET '.lu_GetFieldName('Current Location', 'Staff').' = "'.$_POST['current_location'].'" WHERE ID="'.$_POST['staff_id'].'"'); } } $locations = lu_GetTableRow('Location', $org_id, 'record_ID'); if(count($locations['rows']) > 1) { $_TPL->assign('staffLocations', $locations); } $loc_ids = array(); foreach($locations['rows'] as $location) { $loc_ids[] = $location['ID']; } // array_dump($locations); // array_dump($_POST); $_TPL->assign('searchStaffQuery', $_GET['searchStaff']); $where = ' AND record_Inactive != "checked"'; if($_GET['searchStaffIn'] == 'Yes' && $_GET['searchStaffOut'] != 'Yes') { $where .= ' AND ('.lu_GetFieldName('Staff Status', 'Staff').' = "" OR '.lu_GetFieldName('Staff Status', 'Staff').' = "On-Site")'. $_TPL->assign('searchStaffIn', 'Yes'); $_TPL->assign('searchStaffOut', 'No'); } else if($_GET['searchStaffOut'] == 'Yes' && $_GET['searchStaffIn'] != 'Yes') { $where .= ' AND ('.lu_GetFieldName('Staff Status', 'Staff').' != "" AND '.lu_GetFieldName('Staff Status', 'Staff').' != "On-Site")'. 
$_TPL->assign('searchStaffOut', 'Yes'); $_TPL->assign('searchStaffIn', 'No'); } else { $_TPL->assign('searchStaffOut', 'Yes'); $_TPL->assign('searchStaffIn', 'Yes'); } if($searchStaffs = lu_GetTableRow('Staff', $_GET['searchStaff'], '#search#', 'record_ID IN ('.implode(',', $loc_ids).')'.$where, lu_GetFieldName('First Name', 'Staff').','.lu_GetFieldName('Surname', 'Staff'))) { $_TPL->assign('searchStaffs', $searchStaffs); } else { $_TPL->assign('searchStaffNotFound', true); } } else if($_GET['searchContractor']) { $_TPL->assign('searchContractorQuery', $_GET['searchContractor']); //$where = ' AND '.lu_GetTableName('ContractorStaff').'.record_Inactive != "checked"'; $where = ' '; if($_GET['searchContractorIn'] == 'Yes' && $_GET['searchContractorOut'] != 'Yes') { $where .= ' AND ('.lu_GetFieldName('Onsite Status', 'ContractorStaff').' = "Onsite")'; $_TPL->assign('searchContractorIn', 'Yes'); $_TPL->assign('searchContractorOut', 'No'); } else if($_GET['searchContractorOut'] == 'Yes' && $_GET['searchContractorIn'] != 'Yes') { $where .= ' AND ('.lu_GetFieldName('Onsite Status', 'ContractorStaff').' != "Onsite")'. $_TPL->assign('searchContractorOut', 'Yes'); $_TPL->assign('searchContractorIn', 'No'); } else { $_TPL->assign('searchContractorOut', 'Yes'); $_TPL->assign('searchContractorIn', 'Yes'); } $join = 'LEFT JOIN '.lu_GetTableName('Contractor').' ON '.lu_GetTableName('Contractor').'.ID = '.lu_GetTableName('ContractorStaff').'.'.lu_GetFieldName('Contractor', 'ContractorStaff'); $extrasearch = array ( lu_GetTableName('Contractor').'.'.lu_GetFieldName('Legal Name', 'Contractor') ); if($searchContractorResult = lu_GetTableRow('ContractorStaff', $_GET['searchContractor'], '#search#', lu_GetTableName('ContractorStaff').'.record_ID = "'.$org_id.'" '.$where, lu_GetFieldName('First Name', 'ContractorStaff').','.lu_GetFieldName('Surname', 'ContractorStaff'), $join, $extrasearch)) { /* foreach($searchContractorResult['rows'] as $key=>$contractor) { $lacp_rights = lu_OrganiseCustomDataFunctionMultiselect($contractor[lu_GetFieldName('Location Access Rights', 'ContractorStaff')]); if(!in_array($_SESSION['dash']['VC']['lacp'], $lacp_rights)) { unset($searchContractorResult['rows'][$key]); } } */ if(count($searchContractorResult['rows'])) { foreach($searchContractorResult['rows'] as $key => $cstaff) { /* if($cstaff[lu_GetFieldName('Onsite_Status', 'Contractor')] == 'Onsite')) { if($visitor['rows'][0][lu_GetFieldName('ConStaff ID', 'Visitor')]) { $_DBH->query('UPDATE '.lu_GetTableName('ContractorStaff').' SET '.lu_GetFieldName('Onsite Status', 'ContractorStaff').' 
= "" WHERE ID="'.$visitor['rows'][0][lu_GetFieldName('ConStaff ID', 'Visitor')].'"'); } } */ if($cstaff[lu_GetFieldName('SACN Expiry Date', 'ContractorStaff')] != '0000-00-00') { if(strtotime($cstaff[lu_GetFieldName('SACN Expiry Date', 'ContractorStaff')]) < time()) { $searchContractorResult['rows'][$key]['sacn_expiry'] = true; } else { $searchContractorResult['rows'][$key]['sacn_expiry'] = false; } } else { $searchContractorResult['rows'][$key]['sacn_expiry'] = false; } if($cstaff[lu_GetFieldName('Induction Valid Until', 'ContractorStaff')] != '0000-00-00') { if(strtotime($cstaff[lu_GetFieldName('Induction Valid Until', 'ContractorStaff')]) < time()) { $searchContractorResult['rows'][$key]['induction_expiry'] = true; } else { $searchContractorResult['rows'][$key]['induction_expiry'] = false; } } else { $searchContractorResult['rows'][$key]['induction_expiry'] = false; } $searchContractorResult['rows'][$key]['contractor'] = lu_GetTableRow('Contractor', $cstaff[lu_GetFieldName('Contractor', 'ContractorStaff')], 'ID'); } $_TPL->assign('searchContractorResult', $searchContractorResult); } else { $_TPL->assign('searchContractorNotFound', true); } } else { $_TPL->assign('searchContractorNotFound', true); } } $occupancy = array(); $occupancy['staffNumber'] = $_DBH->getOne('SELECT count(*) FROM '.lu_GetTableName('Staff').' WHERE record_ID = "'.$_SESSION['dash']['VC']['loc_ID'].'" AND record_Inactive != "checked" AND '.lu_GetFieldName('Ignore Counts', 'Staff').' != "checked"'); $occupancy['staffNumberOnsite']= $_DBH->getOne( 'SELECT count(*) FROM '.lu_GetTableName('Staff').' WHERE ( (record_ID = "'.$_SESSION['dash']['VC']['loc_ID'].'" AND ('.lu_GetFieldName('Staff Status', 'Staff').' = "" OR '.lu_GetFieldName('Staff Status', 'Staff').' = "On-Site")) OR '.lu_GetFieldName('Current Location', 'Staff').' = "'.$_SESSION['dash']['VC']['loc_ID'].'") AND record_Inactive != "checked" AND '.lu_GetFieldName('Ignore Counts', 'Staff').' != "checked"'); $occupancy['visitorsOnsite'] = $_DBH->getOne('SELECT count(*) FROM '.lu_GetTableName('Visitor').' WHERE '.lu_GetFieldName('Location for Visit', 'Visitor').' = "'.$_SESSION['dash']['VC']['loc_ID'].'" AND '.lu_GetFieldName('Checked In', 'Visitor').' = "checked" AND '.lu_GetFieldName('Checked Out', 'Visitor').' != "checked"'); $_TPL->assign('occupancy', $occupancy); if($enotes = lu_GetTableRow('Enote', $org_id, 'record_ID', lu_GetFieldName('Note Emailed', 'Enote').' = "0000-00-00" AND '.lu_GetFieldName('Note Passed On', 'Enote').' 
!= "Yes"')) { $_TPL->assign('EnoteNotice', true); } if($medical = lu_GetTableRow('MedicalRoom', $_SESSION['dash']['VC']['loc_ID'], 'record_ID', 'record_Inactive != "Yes"')) { $_TPL->assign('MedicalNotice', true); } if(lu_GetFieldValue('Printing', 'Location', $_SESSION['dash']['VC']['loc_ID']) != "No" && lu_GetFieldValue('Printing', 'Location', $_SESSION['dash']['VC']['loc_ID']) != "") { $_TPL->assign('UnmannedPrinting', true); } } else { if($_SESSION['dash']['VC']['printing'] == true) { $_TPL->assign('UnmannedPrinting', true); } } // enable if contractor check-in buttons should be enabled if(lu_GetFieldValue('Enable Contractor Check In', 'Location', $_SESSION['dash']['VC']['loc_ID']) == "checked") { $_TPL->assign('ContractorCheckin', true); } } if($_SESSION['dash']['entity_id'] && $_GET['fixupCon'] == 'true') { $conStaffs = lu_GetTableRow('ContractorStaff', $_SESSION['dash']['ModifyConStaffs']['org_ID'], 'record_ID', '', lu_GetFieldName('First Name', 'ContractorStaff').','.lu_GetFieldName('Surname', 'ContractorStaff')); foreach($conStaffs['rows'] as $key => $cstaff) { if($cstaff[lu_GetFieldName('Site Access Card Number', 'ContractorStaff')] && $cstaff[lu_GetFieldName('Site Access Card Type', 'ContractorStaff')]) { echo $cstaff['ID'].' '; $_DBH->query('UPDATE '.lu_GetTableName('Visitor').' SET '.lu_GetFieldName('Site Access Card Number', 'Visitor').' = "'.$cstaff[lu_GetFieldName('Site Access Card Number', 'ContractorStaff')].'", '.lu_GetFieldName('Site Access Card Type', 'Visitor').' = "'.$cstaff[lu_GetFieldName('Site Access Card Type', 'ContractorStaff')].'" WHERE '.lu_GetFieldName('ConStaff ID', 'Visitor').'="'.$cstaff['ID'].'"'); } } } } else { if($_SESSION['dash']['staffs']) { foreach($_SESSION['dash']['staffs']['rows'] as $staff) { if($staff[lu_GetFieldName('Reception Manager', 'Staff')] == 'checked') { $loc_id = $staff['record_ID']; unset($_SESSION['dash']['VC']); if($loc_id = lu_GetFieldValue('ID', 'Location', $loc_id)) { $_SESSION['dash']['VC']['loc_ID'] = $loc_id; header('Location: /vc/'); die(); } } } } $_TPL->assign('mode', 'public'); } $content['page_content'] = $_TPL->fetch('modules/vc.htm'); return $content; } ?> die();die();die();die();die(); This question will probably be closed - i just need some support from my coding brothers and sisters. *SOB*

    Read the article

  • C++ - Conway's Game of Life & Stepping Backwards

    - by Gabe
    I was able to create a version Conway's Game of Life that either stepped forward each click, or just ran forward using a timer. (I'm doing this using Qt.) Now, I need to be able to save all previous game grids, so that I can step backwards by clicking a button. I'm trying to use a stack, and it seems like I'm pushing the old gridcells onto the stack correctly. But when I run it in QT, the grids don't change when I click BACK. I've tried different things for the last three hours, to no avail. Any ideas? gridwindow.cpp - My problem should be in here somewhere. Probably the handleBack() func. #include <iostream> #include "gridwindow.h" using namespace std; // Constructor for window. It constructs the three portions of the GUI and lays them out vertically. GridWindow::GridWindow(QWidget *parent,int rows,int cols) : QWidget(parent) { QHBoxLayout *header = setupHeader(); // Setup the title at the top. QGridLayout *grid = setupGrid(rows,cols); // Setup the grid of colored cells in the middle. QHBoxLayout *buttonRow = setupButtonRow(); // Setup the row of buttons across the bottom. QVBoxLayout *layout = new QVBoxLayout(); // Puts everything together. layout->addLayout(header); layout->addLayout(grid); layout->addLayout(buttonRow); setLayout(layout); } // Destructor. GridWindow::~GridWindow() { delete title; } // Builds header section of the GUI. QHBoxLayout* GridWindow::setupHeader() { QHBoxLayout *header = new QHBoxLayout(); // Creates horizontal box. header->setAlignment(Qt::AlignHCenter); this->title = new QLabel("CONWAY'S GAME OF LIFE",this); // Creates big, bold, centered label (title): "Conway's Game of Life." this->title->setAlignment(Qt::AlignHCenter); this->title->setFont(QFont("Arial", 32, QFont::Bold)); header->addWidget(this->title); // Adds widget to layout. return header; // Returns header to grid window. } // Builds the grid of cells. This method populates the grid's 2D array of GridCells with MxN cells. QGridLayout* GridWindow::setupGrid(int rows,int cols) { isRunning = false; QGridLayout *grid = new QGridLayout(); // Creates grid layout. grid->setHorizontalSpacing(0); // No empty spaces. Cells should be contiguous. grid->setVerticalSpacing(0); grid->setSpacing(0); grid->setAlignment(Qt::AlignHCenter); for(int i=0; i < rows; i++) //Each row is a vector of grid cells. { std::vector<GridCell*> row; // Creates new vector for current row. cells.push_back(row); for(int j=0; j < cols; j++) { GridCell *cell = new GridCell(); // Creates and adds new cell to row. cells.at(i).push_back(cell); grid->addWidget(cell,i,j); // Adds to cell to grid layout. Column expands vertically. grid->setColumnStretch(j,1); } grid->setRowStretch(i,1); // Sets row expansion horizontally. } return grid; // Returns grid. } // Builds footer section of the GUI. QHBoxLayout* GridWindow::setupButtonRow() { QHBoxLayout *buttonRow = new QHBoxLayout(); // Creates horizontal box for buttons. buttonRow->setAlignment(Qt::AlignHCenter); // Clear Button - Clears cell; sets them all to DEAD/white. QPushButton *clearButton = new QPushButton("CLEAR"); clearButton->setFixedSize(100,25); connect(clearButton, SIGNAL(clicked()), this, SLOT(handlePause())); // Pauses timer before clearing. connect(clearButton, SIGNAL(clicked()), this, SLOT(handleClear())); // Connects to clear function to make all cells DEAD/white. buttonRow->addWidget(clearButton); // Forward Button - Steps one step forward. 
QPushButton *forwardButton = new QPushButton("FORWARD"); forwardButton->setFixedSize(100,25); connect(forwardButton, SIGNAL(clicked()), this, SLOT(handleForward())); // Signals to handleForward function.. buttonRow->addWidget(forwardButton); // Back Button - Steps one step backward. QPushButton *backButton = new QPushButton("BACK"); backButton->setFixedSize(100,25); connect(backButton, SIGNAL(clicked()), this, SLOT(handleBack())); // Signals to handleBack funciton. buttonRow->addWidget(backButton); // Start Button - Starts game when user clicks. Or, resumes game after being paused. QPushButton *startButton = new QPushButton("START/RESUME"); startButton->setFixedSize(100,25); connect(startButton, SIGNAL(clicked()), this, SLOT(handlePause())); // Deletes current timer if there is one. Then restarts everything. connect(startButton, SIGNAL(clicked()), this, SLOT(handleStart())); // Signals to handleStart function. buttonRow->addWidget(startButton); // Pause Button - Pauses simulation of game. QPushButton *pauseButton = new QPushButton("PAUSE"); pauseButton->setFixedSize(100,25); connect(pauseButton, SIGNAL(clicked()), this, SLOT(handlePause())); // Signals to pause function which pauses timer. buttonRow->addWidget(pauseButton); // Quit Button - Exits program. QPushButton *quitButton = new QPushButton("EXIT"); quitButton->setFixedSize(100,25); connect(quitButton, SIGNAL(clicked()), qApp, SLOT(quit())); // Signals the quit slot which ends the program. buttonRow->addWidget(quitButton); return buttonRow; // Returns bottom of layout. } /* SLOT method for handling clicks on the "clear" button. Receives "clicked" signals on the "Clear" button and sets all cells to DEAD. */ void GridWindow::handleClear() { for(unsigned int row=0; row < cells.size(); row++) // Loops through current rows' cells. { for(unsigned int col=0; col < cells[row].size(); col++) // Loops through the rows'columns' cells. { GridCell *cell = cells[row][col]; // Grab the current cell & set its value to dead. cell->setType(DEAD); } } } /* SLOT method for handling clicks on the "start" button. Receives "clicked" signals on the "start" button and begins game simulation. */ void GridWindow::handleStart() { isRunning = true; // It is running. Sets isRunning to true. this->timer = new QTimer(this); // Creates new timer. connect(this->timer, SIGNAL(timeout()), this, SLOT(timerFired())); // Connect "timerFired" method class to the "timeout" signal fired by the timer. this->timer->start(500); // Timer to fire every 500 milliseconds. } /* SLOT method for handling clicks on the "pause" button. Receives "clicked" signals on the "pause" button and stops the game simulation. */ void GridWindow::handlePause() { if(isRunning) // If it is running... this->timer->stop(); // Stops the timer. isRunning = false; // Set to false. } void GridWindow::handleForward() { if(isRunning); // If it's running, do nothing. else timerFired(); // It not running, step forward one step. } void GridWindow::handleBack() { std::vector<std::vector<GridCell*> > cells2; if(isRunning); // If it's running, do nothing. else if(backStack.empty()) cout << "EMPTYYY" << endl; else { cells2 = backStack.peek(); for (unsigned int f = 0; f < cells.size(); f++) // Loop through cells' rows. { for (unsigned int g = 0; g < cells.at(f).size(); g++) // Loop through cells columns. { cells[f][g]->setType(cells2[f][g]->getType()); // Set cells[f][g]'s type to cells2[f][g]'s type. 
} } cout << "PRE=POP" << endl; backStack.pop(); cout << "OYYYY" << endl; } } // Accessor method - Gets the 2D vector of grid cells. std::vector<std::vector<GridCell*> >& GridWindow::getCells() { return this->cells; } /* TimerFired function: 1) 2D-Vector cells2 is declared. 2) cells2 is initliazed with loops/push_backs so that all its cells are DEAD. 3) We loop through cells, and count the number of LIVE neighbors next to a given cell. --> Depending on how many cells are living, we choose if the cell should be LIVE or DEAD in the next simulation, according to the rules. -----> We save the cell type in cell2 at the same indice (the same row and column cell in cells2). 4) After check all the cells (and save the next round values in cells 2), we set cells's gridcells equal to cells2 gridcells. --> This causes the cells to be redrawn with cells2 types (white or black). */ void GridWindow::timerFired() { backStack.push(cells); std::vector<std::vector<GridCell*> > cells2; // Holds new values for 2D vector. These are the next simulation round of cell types. for(unsigned int i = 0; i < cells.size(); i++) // Loop through the rows of cells2. (Same size as cells' rows.) { vector<GridCell*> row; // Creates Gridcell* vector to push_back into cells2. cells2.push_back(row); // Pushes back row vectors into cells2. for(unsigned int j = 0; j < cells[i].size(); j++) // Loop through the columns (the cells in each row). { GridCell *cell = new GridCell(); // Creates new GridCell. cell->setType(DEAD); // Sets cell type to DEAD/white. cells2.at(i).push_back(cell); // Pushes back the DEAD cell into cells2. } // This makes a gridwindow the same size as cells with all DEAD cells. } for (unsigned int m = 0; m < cells.size(); m++) // Loop through cells' rows. { for (unsigned int n = 0; n < cells.at(m).size(); n++) // Loop through cells' columns. { unsigned int neighbors = 0; // Counter for number of LIVE neighbors for a given cell. // We know check all different variations of cells[i][j] to count the number of living neighbors for each cell. // We check m > 0 and/or n > 0 to make sure we don't access negative indexes (ex: cells[-1][0].) // We check m < size to make sure we don't try to access rows out of the vector (ex: row 5, if only 4 rows). // We check n < row size to make sure we don't access column item out of the vector (ex: 10th item in a column of only 9 items). // If we find that the Type = 1 (it is LIVE), then we add 1 to the neighbor. // Else - we add nothing to the neighbor counter. // Neighbor is the number of LIVE cells next to the current cell. if(m > 0 && n > 0) { if (cells[m-1][n-1]->getType() == 1) neighbors += 1; } if(m > 0) { if (cells[m-1][n]->getType() == 1) neighbors += 1; if(n < (cells.at(m).size() - 1)) { if (cells[m-1][n+1]->getType() == 1) neighbors += 1; } } if(n > 0) { if (cells[m][n-1]->getType() == 1) neighbors += 1; if(m < (cells.size() - 1)) { if (cells[m+1][n-1]->getType() == 1) neighbors += 1; } } if(n < (cells.at(m).size() - 1)) { if (cells[m][n+1]->getType() == 1) neighbors += 1; } if(m < (cells.size() - 1)) { if (cells[m+1][n]->getType() == 1) neighbors += 1; } if(m < (cells.size() - 1) && n < (cells.at(m).size() - 1)) { if (cells[m+1][n+1]->getType() == 1) neighbors += 1; } // Done checking number of neighbors for cells[m][n] // Now we change cells2 if it should switch in the next simulation step. // cells2 holds the values of what cells should be on the next iteration of the game. // We can't change cells right now, or it would through off our other cell values. 
// Apply game rules to cells: Create new, updated grid with the roundtwo vector. // Note - LIVE is 1; DEAD is 0. if (cells[m][n]->getType() == 1 && neighbors < 2) // If cell is LIVE and has less than 2 LIVE neighbors -> Set to DEAD. cells2[m][n]->setType(DEAD); else if (cells[m][n]->getType() == 1 && neighbors > 3) // If cell is LIVE and has more than 3 LIVE neighbors -> Set to DEAD. cells2[m][n]->setType(DEAD); else if (cells[m][n]->getType() == 1 && (neighbors == 2 || neighbors == 3)) // If cell is LIVE and has 2 or 3 LIVE neighbors -> Set to LIVE. cells2[m][n]->setType(LIVE); else if (cells[m][n]->getType() == 0 && neighbors == 3) // If cell is DEAD and has 3 LIVE neighbors -> Set to LIVE. cells2[m][n]->setType(LIVE); } } // Now we've gone through all of cells, and saved the new values in cells2. // Now we loop through cells and set all the cells' types to those of cells2. for (unsigned int f = 0; f < cells.size(); f++) // Loop through cells' rows. { for (unsigned int g = 0; g < cells.at(f).size(); g++) // Loop through cells columns. { cells[f][g]->setType(cells2[f][g]->getType()); // Set cells[f][g]'s type to cells2[f][g]'s type. } } } stack.h - Here's my stack. #ifndef STACK_H_ #define STACK_H_ #include <iostream> #include "node.h" template <typename T> class Stack { private: Node<T>* top; int listSize; public: Stack(); int size() const; bool empty() const; void push(const T& value); void pop(); T& peek() const; }; template <typename T> Stack<T>::Stack() : top(NULL) { listSize = 0; } template <typename T> int Stack<T>::size() const { return listSize; } template <typename T> bool Stack<T>::empty() const { if(listSize == 0) return true; else return false; } template <typename T> void Stack<T>::push(const T& value) { Node<T>* newOne = new Node<T>(value); newOne->next = top; top = newOne; listSize++; } template <typename T> void Stack<T>::pop() { Node<T>* oldT = top; top = top->next; delete oldT; listSize--; } template <typename T> T& Stack<T>::peek() const { return top->data; // Returns data in top item. } #endif gridcell.cpp - Gridcell implementation #include <iostream> #include "gridcell.h" using namespace std; // Constructor: Creates a grid cell. GridCell::GridCell(QWidget *parent) : QFrame(parent) { this->type = DEAD; // Default: Cell is DEAD (white). setFrameStyle(QFrame::Box); // Set the frame style. This is what gives each box its black border. this->button = new QPushButton(this); //Creates button that fills entirety of each grid cell. this->button->setSizePolicy(QSizePolicy::Expanding,QSizePolicy::Expanding); // Expands button to fill space. this->button->setMinimumSize(19,19); //width,height // Min height and width of button. QHBoxLayout *layout = new QHBoxLayout(); //Creates a simple layout to hold our button and add the button to it. layout->addWidget(this->button); setLayout(layout); layout->setStretchFactor(this->button,1); // Lets the buttons expand all the way to the edges of the current frame with no space leftover layout->setContentsMargins(0,0,0,0); layout->setSpacing(0); connect(this->button,SIGNAL(clicked()),this,SLOT(handleClick())); // Connects clicked signal with handleClick slot. redrawCell(); // Calls function to redraw (set new type for) the cell. } // Basic destructor. GridCell::~GridCell() { delete this->button; } // Accessor for the cell type. CellType GridCell::getType() const { return(this->type); } // Mutator for the cell type. Also has the side effect of causing the cell to be redrawn on the GUI. 
void GridCell::setType(CellType type) { this->type = type; redrawCell(); // Sets type and redraws cell. } // Handler slot for button clicks. This method is called whenever the user clicks on this cell in the grid. void GridCell::handleClick() { // When clicked on... if(this->type == DEAD) // If type is DEAD (white), change to LIVE (black). type = LIVE; else type = DEAD; // If type is LIVE (black), change to DEAD (white). setType(type); // Sets new type (color). setType Calls redrawCell() to recolor. } // Method to check cell type and return the color of that type. Qt::GlobalColor GridCell::getColorForCellType() { switch(this->type) { default: case DEAD: return Qt::white; case LIVE: return Qt::black; } } // Helper method. Forces current cell to be redrawn on the GUI. Called whenever the setType method is invoked. void GridCell::redrawCell() { Qt::GlobalColor gc = getColorForCellType(); //Find out what color this cell should be. this->button->setPalette(QPalette(gc,gc)); //Force the button in the cell to be the proper color. this->button->setAutoFillBackground(true); this->button->setFlat(true); //Force QT to NOT draw the borders on the button } Thanks a lot. Let me know if you need anything else.
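The usual culprit with this kind of BACK button is that backStack.push(cells) only copies the GridCell pointers, so every "snapshot" on the stack still refers to the same GridCell objects that timerFired() keeps mutating; popping therefore restores nothing visible. A minimal sketch of one way around it, assuming backStack is re-declared as Stack<std::vector<std::vector<CellType> > > and a pushSnapshot() helper is declared in gridwindow.h (the other names match the question):

// Sketch: snapshot the cell *values*, not the widget pointers.
void GridWindow::pushSnapshot()
{
    std::vector<std::vector<CellType> > snapshot;
    for (unsigned int r = 0; r < cells.size(); r++)
    {
        std::vector<CellType> row;
        for (unsigned int c = 0; c < cells[r].size(); c++)
            row.push_back(cells[r][c]->getType());   // copy the LIVE/DEAD value
        snapshot.push_back(row);
    }
    backStack.push(snapshot);   // call this at the top of timerFired() instead of push(cells)
}

void GridWindow::handleBack()
{
    if (isRunning || backStack.empty())
        return;
    // Copy before pop(), because pop() deletes the node that peek() refers to.
    std::vector<std::vector<CellType> > snapshot = backStack.peek();
    backStack.pop();
    for (unsigned int r = 0; r < cells.size(); r++)
        for (unsigned int c = 0; c < cells[r].size(); c++)
            cells[r][c]->setType(snapshot[r][c]);    // setType() repaints the cell
}

Storing the enum values is also much cheaper than allocating fresh GridCell widgets for every step of history.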

    Read the article

  • Calculix Data Visualiser using QT

    - by Ann
    I am doing a project on CalculiX data visualizor,using Qt.I 've to draw the structure and after giving force the displacement should be shawn as variation in color.I chose HSV coloring,but while executing I got an error message:"QColor::from Hsv:HSV parameters out of range".The code is: DataViz1::DataViz1(QWidget *parent) : QWidget(parent), ui(new Ui::DataViz1) { DArea = new QGLScreen(this); DArea-setGeometry(QRect(10,10,700,600)); //TODO This values are feeded by user dfile="/home/41407/color.txt";//input file with displacement mfile="/home/41407/mesh21.txt";//input file nodeId="*NODE"; elId="*ELEMENT"; DataId="displ"; parseMfile(); parseDfile(); DArea->Nodes=Nodes; DArea->Elements=Elements; DArea->Data=Data; DArea->fillColorArray(); //printf("Colr is %d",DArea->pickColor(-11.02,0));fflush(stdout); ui->setupUi(this); } DataViz1::~DataViz1() { delete ui; } void DataViz1::parseMfile() { QFile file(mfile); if (!file.open(QIODevice::ReadOnly | QIODevice::Text)) return; int node_end=0; QTextStream in(&file); in.skipWhiteSpace(); while (!in.atEnd()) { QString line = in.readLine(); if(line.startsWith(nodeId))//Node block in Mfile { while(1) { line = in.readLine(); if(line.startsWith(elId)) { break; } Nodes< while(1) { line = in.readLine(); Elements<<line; //printf("Element is %s\n",line.toLocal8Bit().constData());fflush(stdout); if(in.atEnd()) break; } } } } void DataViz1::parseDfile() { QFile file(dfile); if (!file.open(QIODevice::ReadOnly | QIODevice::Text)) return; int node_end=0; QTextStream in(&file); in.skipWhiteSpace(); while (!in.atEnd()) { QString line = in.readLine(); if(line.startsWith(DataId)) { continue; } line = in.readLine(); Data< } /......................................................................../ include "qglscreen.h" include GLfloat LightAmbient[]= { 0.5f, 0.5f, 0.5f, 1.0f }; GLfloat LightDiffuse[]= { 1.0f, 1.0f, 1.0f, 1.0f }; GLfloat LightPosition[]= { 0.0f, 0.0f, 2.0f, 1.0f }; QGLScreen::QGLScreen(QWidget *parent):QGLWidget(QGLFormat(QGL::SampleBuffers), parent) { clearColor = Qt::black; xRot = 0; yRot = 0; zRot = 0; ifdef QT_OPENGL_ES_2 program = 0; endif //TODO user input ElType="HE8"; DType="SolidFrame"; axis="X"; } QGLScreen::~QGLScreen() { } QSize QGLScreen::minimumSizeHint() const { return QSize(50, 50); } QSize QGLScreen::sizeHint() const { return QSize(200, 200); } void QGLScreen::setClearColor(const QColor &color) { clearColor = color; updateGL(); } void QGLScreen::initializeGL() { xRot=0; yRot=0; zRot=0; scaling = 1.0; /* select clearing (background) color */ glClearColor (0.0, 0.0, 0.0, 0.0); glMatrixMode(GL_PROJECTION); glLoadIdentity(); // glViewport(0,0,10,10); glOrtho(-10.0, +10.0, -10.0, +10.0, -10.0,+10.0); glEnable (GL_LINE_SMOOTH); glHint (GL_LINE_SMOOTH_HINT, GL_DONT_CARE); } void QGLScreen::wheel1() { scaling1 += .0025; count2++; update(); } void QGLScreen::wheel2() { if(count2-14) { scaling1 -= .0025; count2--; update(); } } void QGLScreen::drawModel(int x1,int y1,int x2,int y2) { makeCurrent(); QStringList Cnode,Celement; for (int i = 0; i < Elements.size(); ++i) { Celement=Elements.at(i).split(","); // printf("Element is %s",Celement.at(0).toLocal8Bit().constData());fflush(stdout); //printf("Node at el is %s\n",(findNode(Celement.at(1).toInt())).at(1).toLocal8Bit().constData()); fflush(stdout); if(ElType=="HE8") { //First four nodes float ENX1=(findNode(Celement.at(1).toInt())).at(1).toDouble(); float ENX2=(findNode(Celement.at(2).toInt())).at(1).toDouble(); float ENX3=(findNode(Celement.at(3).toInt())).at(1).toDouble(); float 
ENX4=(findNode(Celement.at(4).toInt())).at(1).toDouble(); float ENY1=(findNode(Celement.at(1).toInt())).at(2).toDouble(); float ENY2=(findNode(Celement.at(2).toInt())).at(2).toDouble(); float ENY3=(findNode(Celement.at(3).toInt())).at(2).toDouble(); float ENY4=(findNode(Celement.at(4).toInt())).at(2).toDouble(); float ENZ1=(findNode(Celement.at(1).toInt())).at(3).toDouble(); float ENZ2=(findNode(Celement.at(2).toInt())).at(3).toDouble(); float ENZ3=(findNode(Celement.at(3).toInt())).at(3).toDouble(); float ENZ4=(findNode(Celement.at(4).toInt())).at(3).toDouble(); //Second four Nodes float ENX5=(findNode(Celement.at(5).toInt())).at(1).toDouble(); float ENX6=(findNode(Celement.at(6).toInt())).at(1).toDouble(); float ENX7=(findNode(Celement.at(7).toInt())).at(1).toDouble(); float ENX8=(findNode(Celement.at(8).toInt())).at(1).toDouble(); float ENY5=(findNode(Celement.at(5).toInt())).at(2).toDouble(); float ENY6=(findNode(Celement.at(6).toInt())).at(2).toDouble(); float ENY7=(findNode(Celement.at(7).toInt())).at(2).toDouble(); float ENY8=(findNode(Celement.at(8).toInt())).at(2).toDouble(); float ENZ5=(findNode(Celement.at(5).toInt())).at(3).toDouble(); float ENZ6=(findNode(Celement.at(6).toInt())).at(3).toDouble(); float ENZ7=(findNode(Celement.at(7).toInt())).at(3).toDouble(); float ENZ8=(findNode(Celement.at(8).toInt())).at(3).toDouble(); //Identify Colors GLfloat ENC[8][3]; for(int k=1;k<8;k++) { int hsv=pickColor(findData(Celement.at(k).toInt()).toDouble(),0); //printf("hsv is %d=",hsv);fflush(stdout); getRGB(hsv); //printf("%d*%d*%d\n",red,green,blue); //ENC[k]={red,green,blue}; ENC[k][0]=red; ENC[k][1]=green; ENC[k][2]=blue; } //Plot the first four direct loop if(DType=="WireFrame"){ glBegin(GL_LINE_LOOP); glColor3f(255,0,0); glVertex3f(ENX1,ENY1,ENZ1); glColor3f(255,0,0); glVertex3f(ENX2,ENY2,ENZ2); glColor3f(255,0,0); glVertex3f(ENX3,ENY3,ENZ3); glColor3f(255,0,0); glVertex3f(ENX4,ENY4,ENZ4); glEnd(); //Plot the second four direct loop glBegin(GL_LINE_LOOP); glColor3f(0,0,255); glVertex3f(ENX5,ENY5,ENZ5); glColor3f(0,0,255); glVertex3f(ENX6,ENY6,ENZ6); glColor3f(0,0,255); glVertex3f(ENX7,ENY7,ENZ7); glColor3f(0,0,255); glVertex3f(ENX8,ENY8,ENZ8); glEnd(); //Plot the interconnections glBegin(GL_LINE); glColor3f(150,150,150); glVertex3f(ENX1,ENY1,ENZ1); glVertex3f(ENX5,ENY5,ENZ5); glEnd(); glBegin(GL_LINE); glColor3f(150,150,150); glVertex3f(ENX2,ENY2,ENZ2); glVertex3f(ENX6,ENY6,ENZ6); glEnd(); glBegin(GL_LINE); glColor3f(150,150,150); glVertex3f(ENX3,ENY3,ENZ3); glVertex3f(ENX7,ENY7,ENZ7); glEnd(); glBegin(GL_LINE); glColor3f(150,150,150); glVertex3f(ENX4,ENY4,ENZ4); glVertex3f(ENX8,ENY8,ENZ8); glEnd(); } if(DType=="SolidFrame") { glBegin(GL_QUADS); glColor3fv(ENC[1]); glVertex3f(ENX1,ENY1,ENZ1); glColor3fv(ENC[2]); glVertex3f(ENX2,ENY2,ENZ2); glColor3fv(ENC[3]); glVertex3f(ENX3,ENY3,ENZ3); glColor3fv(ENC[4]); glVertex3f(ENX4,ENY4,ENZ4); glEnd(); //break; glBegin(GL_QUADS); glColor3fv(ENC[5]); glVertex3f(ENX5,ENY5,ENZ5); glColor3fv(ENC[6]); glVertex3f(ENX6,ENY6,ENZ6); glColor3fv(ENC[7]); glVertex3f(ENX7,ENY7,ENZ7); glColor3fv(ENC[8]); glVertex3f(ENX8,ENY8,ENZ8); glEnd(); glBegin(GL_QUAD_STRIP); glColor3fv(ENC[1]); glVertex3f(ENX1,ENY1,ENZ1); glColor3fv(ENC[5]); glVertex3f(ENX5,ENY5,ENZ5); glColor3fv(ENC[2]); glVertex3f(ENX2,ENY2,ENZ2); glColor3fv(ENC[6]); glVertex3f(ENX6,ENY6,ENZ6); glEnd(); glBegin(GL_QUAD_STRIP); glColor3fv(ENC[3]); glVertex3f(ENX3,ENY3,ENZ3); glColor3fv(ENC[7]); glVertex3f(ENX7,ENY7,ENZ7); glColor3fv(ENC[4]); glVertex3f(ENX4,ENY4,ENZ4); glColor3fv(ENC[8]); 
glVertex3f(ENX8,ENY8,ENZ8); glEnd(); glBegin(GL_QUAD_STRIP); glColor3fv(ENC[2]); glVertex3f(ENX2,ENY2,ENZ2); glColor3fv(ENC[6]); glVertex3f(ENX6,ENY6,ENZ6); glColor3fv(ENC[3]); glVertex3f(ENX3,ENY3,ENZ3); glColor3fv(ENC[7]); glVertex3f(ENX7,ENY7,ENZ7); glEnd(); glBegin(GL_QUAD_STRIP); glColor3fv(ENC[1]); glVertex3f(ENX1,ENY1,ENZ1); glColor3fv(ENC[5]); glVertex3f(ENX5,ENY5,ENZ5); glColor3fv(ENC[4]); glVertex3f(ENX4,ENY4,ENZ4); glColor3fv(ENC[8]); glVertex3f(ENX8,ENY8,ENZ8); glEnd(); } } } } QStringList QGLScreen::findNode(int element) { QStringList Temp; for (int i = 0; i < Nodes.size(); ++i) { Temp=Nodes.at(i).split(","); if(Temp.at(0).toInt()==element) { break; } } return Temp; } QString QGLScreen::findData(int Node) { QString Temp; QRegExp sep("\s+"); for (int i = 0; i < Data.size(); ++i) { if((Data.at(i).split("\t")).at(0).section(sep,1,1).toInt()==Node) { if(axis=="X") { Temp=Data.at(i).split("\t").at(0).section(sep,2,2); } if(axis=="Y") { Temp=Data.at(i).split("\t").at(0).section(sep,3,3); } if(axis=="Z") { Temp=Data.at(i).split("\t").at(0).section(sep,4,4); } break; } } return Temp; } void QGLScreen::fillColorArray() { QString Temp1,Temp2,Temp3; double d1s=0,d2s=0,d3s=0,d1l=0,d2l=0,d3l=0,diff=0; QRegExp sep("\\s+"); for (int i = 0; i < Data.size(); ++i) { Temp1=(Data.at(i).split("\t")).at(0).section(sep,2,2); if(d1s>Temp1.toDouble()) { d1s=Temp1.toDouble(); } if(d1l<Temp1.toDouble()) { d1l=Temp1.toDouble(); } Temp2=(Data.at(i).split("\t")).at(0).section(sep,3,3); if(d2s>Temp2.toDouble()) { d2s=Temp2.toDouble(); } if(d2l<Temp2.toDouble()) { d2l=Temp2.toDouble(); } Temp3=(Data.at(i).split("\t")).at(0).section(sep,4,4); if(d3s>Temp3.toDouble()) { d3s=Temp3.toDouble(); } if(d3l<Temp3.toDouble()) { d3l=Temp3.toDouble(); } // printf("data is %s",Temp.toLocal8Bit().constData());fflush(stdout); } color[0][0]=d1l; for(int i=1;i<360;i++) { //printf("Large is%f small is %f",d1l,d1s); diff=d1l-d1s; if(d1l==0&&d1s<0) color[0][i]=color[0][i-1]-diff/360; else if(d1l>0&&d1s==0) color[0][i]=color[0][i-1]+diff/360; else if(d1l>0&&d1s<0) color[0][i]=color[0][i-1]-diff/360; diff=d2l-d2s; if(d2l==0&&d2s<0) color[1][i]=color[1][i-1]-diff/360; else if(d2l>0&&d2s==0) color[1][i]=color[1][i-1]+diff/360; else if(d2l>0&&d2s<0) color[1][i]=color[1][i-1]-diff/360; diff=d3l-d3s; if(d3l==0&&d3s<0) color[2][i]=color[2][i-1]-diff/360; else if(d3l>0&&d3s==0) color[2][i]=color[2][i-1]+diff/360; else if(d3l>0&&d3s<0) color[2][i]=color[2][i-1]-diff/360; } //for(int i=0;i<360;i++) printf("%d %f %f %f\n",i,color[0][i],color[1][i],color[2][i]); } int QGLScreen::pickColor(double data,int Did) { int i,pos; if(axis=="X")Did=0; if(axis=="Y")Did=1; if(axis=="Z")Did=2; //printf("%f data is",data);fflush(stdout); for(int i=0;i<360;i++) { if(color[Did][i]<data && data>color[Did][i+1]) { //printf("Orginal dat is %f Data found is %f and pos %d\n",data,color[Did][i],i);fflush(stdout); pos=i; break; } } return pos; } void QGLScreen::getRGB(int hsv) { QColor c; c.setHsv(hsv,255,255,255); QColor r=QColor::fromHsv(hsv,255,255); red=r.red(); green=r.green(); blue=r.blue(); } void QGLScreen::paintGL() { glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT); glPushAttrib(GL_ALL_ATTRIB_BITS); glMatrixMode(GL_PROJECTION); glPushMatrix(); glLoadIdentity(); GLfloat x = 3.0 * GLfloat(width()) / height(); glOrtho(-x, +x, -3.0, +3.0, 4.0, 15.0); glMatrixMode(GL_MODELVIEW); glPushMatrix(); glLoadIdentity(); glTranslatef(0.0, 0.0, -10.0); glScalef(scaling, scaling, scaling); glRotatef(xRot, 1.0, 0.0, 0.0); glRotatef(yRot, 0.0, 1.0, 0.0); 
glRotatef(zRot, 0.0, 0.0, 1.0); drawModel(0,0,1,1); /* don't wait! * start processing buffered OpenGL routines */ glFlush (); } /void QGLScreen::zoom1() { scaling+=.05; update(); }/ void QGLScreen::resizeGL(int width, int height) { int side = qMin(width, height); glViewport((width - side) / 2, (height - side) / 2, side, side); #if !defined(QT_OPENGL_ES_2) glMatrixMode(GL_PROJECTION); glLoadIdentity(); #ifndef QT_OPENGL_ES glOrtho(-0.5, +0.5, +0.5, -0.5, 4.0, 15.0); #else glOrthof(-0.5, +0.5, +0.5, -0.5, 4.0, 15.0); #endif glMatrixMode(GL_MODELVIEW); #endif } void QGLScreen::mousePressEvent(QMouseEvent *event) { lastPos = event-pos(); } void QGLScreen::mouseMoveEvent(QMouseEvent *event) { GLfloat dx = GLfloat(event->x() - lastPos.x()) / width(); GLfloat dy = GLfloat(event->y() - lastPos.y()) / height(); if (event->buttons() & Qt::LeftButton) { xRot+= 180 * dy; yRot += 180 * dx; update(); } else if (event->buttons() & Qt::RightButton) { xRot += 180 * dy; yRot += 180 * dx; update(); } lastPos = event->pos(); } void QGLScreen::mouseReleaseEvent(QMouseEvent * /* event */) { emit clicked(); }
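On the specific "HSV parameters out of range" warning: QColor::fromHsv() only accepts a hue between -1 and 359 (and 0-255 for saturation, value and alpha), while pickColor() can fall through its loop without ever assigning pos, so an arbitrary value can reach fromHsv(). A minimal defensive sketch of getRGB(), assuming the int members red, green and blue from the question:

// Sketch: clamp the hue before building the colour, so QColor::fromHsv()
// never sees an out-of-range value even if pickColor() misbehaves.
void QGLScreen::getRGB(int hsv)
{
    int hue = qBound(0, hsv, 359);             // QColor::fromHsv() allows -1..359
    QColor c = QColor::fromHsv(hue, 255, 255);
    red   = c.red();
    green = c.green();
    blue  = c.blue();
}

Initialising pos to a safe default in pickColor() would address the root cause rather than the symptom. Note also that glColor3f() expects components in the 0.0-1.0 range, so the 0-255 values stored in ENC will be clamped to 1.0 (appearing white) unless they are divided by 255.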

    Read the article

  • Ajax Control Toolkit July 2011 Release and the New HTML Editor Extender

    - by Stephen Walther
    I’m happy to announce the July 2011 release of the Ajax Control Toolkit which includes important bug fixes and a completely new HTML Editor Extender control. You can download the July 2011 Release by visiting the Ajax Control Toolkit CodePlex site at: http://AjaxControlToolkit.CodePlex.com Using the New HTML Editor Extender Control You can use the new HTML Editor Extender to extend any standard ASP.NET TextBox control so that it supports rich formatting such as bold, italics, bulleted lists, numbered lists, typefaces and different foreground and background colors. The following code illustrates how you can extend a standard ASP.NET TextBox control with the HtmlEditorExtender: <%@ Page Language="C#" AutoEventWireup="true" CodeBehind="Simple.aspx.cs" Inherits="WebApplication1.Simple" %> <%@ Register TagPrefix="asp" Namespace="AjaxControlToolkit" Assembly="AjaxControlToolkit" %> <html xmlns="http://www.w3.org/1999/xhtml"> <head runat="server"> <title>Simple</title> </head> <body> <form id="form1" runat="server"> <asp:ToolkitScriptManager runat="Server" /> <asp:TextBox ID="txtComments" TextMode="MultiLine" Columns="60" Rows="8" runat="server" /> <asp:HtmlEditorExtender TargetControlID="txtComments" runat="server" /> </form> </body> </html> This page has the following three controls: ToolkitScriptManager – The ToolkitScriptManager renders all of the scripts required by the Ajax Control Toolkit. TextBox – The TextBox control is a standard ASP.NET TextBox which is set to display multiple lines (a TextArea instead of an Input element). HtmlEditorExtender – The HtmlEditorExtender is set to extend the TextBox control. You can use the standard TextBox Text property to read the rich text entered into the TextBox control on the server. Lightweight and HTML5 The HTML Editor Extender works on all modern browsers including the most recent versions of Mozilla Firefox (Firefox 5), Google Chrome (Chrome 12), and Apple Safari (Safari 5). Furthermore, the HTML Editor Extender is compatible with Microsoft Internet Explorer 6 and newer. The HTML Editor Extender is very lightweight. It takes advantage of the HTML5 ContentEditable attribute so it does not require an iframe or complex browser workarounds. If you select View Source in your browser while using the HTML Editor Extender, we hope that you will be pleasantly surprised by how little markup and script is generated by the HTML Editor Extender. Customizable Toolbar Buttons Depending on the web application that you are building, you will want to display different toolbar buttons with the HTML Editor Extender. One of the design goals of the HTML Editor Extender was to make it very easy for you to customize the toolbar buttons. Imagine, for example, that you want to use the HTML Editor Extender when accepting comments on blog posts. In that case, you might want to restrict the type of formatting that a user can display. You might want to enable a user to format text as bold or italic but you do not want the user to make any other formatting changes. 
The following page illustrates how you can customize the HTML Editor Extender toolbar: <%@ Page Language="C#" AutoEventWireup="true" CodeBehind="CustomToolbar.aspx.cs" Inherits="WebApplication1.CustomToolbar" %> <%@ Register TagPrefix="asp" Namespace="AjaxControlToolkit" Assembly="AjaxControlToolkit" %> <html> <head runat="server"> <title>Custom Toolbar</title> </head> <body> <form id="form1" runat="server"> <asp:ToolkitScriptManager Runat="server" /> <asp:TextBox ID="txtComments" TextMode="MultiLine" Columns="50" Rows="10" Text="Hello <b>world!</b>" Runat="server" /> <asp:HtmlEditorExtender TargetControlID="txtComments" runat="server"> <Toolbar> <asp:Bold /> <asp:Italic /> </Toolbar> </asp:HtmlEditorExtender> </form> </body> </html> Notice that the HTML Editor Extender in the page above has a Toolbar subtag. You can list the toolbar buttons which you want to appear within the subtag. In the case above, only Bold and Italic buttons are displayed. Here is a complete list of the Toolbar buttons currently supported by the HTML Editor Extender: Undo Redo Bold Italic Underline StrikeThrough Subscript Superscript JustifyLeft JustifyCenter JustifyRight JustifyFull InsertOrderedList InsertUnorderedList CreateLink UnLink RemoveFormat SelectAll UnSelect Delete Cut Copy Paste BackgroundColorSelector ForeColorSelector FontNameSelector FontSizeSelector Indent Outdent InsertHorizontalRule HorizontalSeparator Of course the HTML Editor Extender was designed to be extensible. You can create your own buttons and add them to the control. Compatible with the AntiXSS Library When using the HTML Editor Extender on a public facing website, we strongly recommend that you use the HTML Editor Extender with the AntiXSS Library. If you allow users to submit arbitrary HTML, and you don’t take any action to strip out malicious markup, then you are opening your website to Cross-Site Scripting Attacks (XSS attacks). The HTML Editor Extender uses the Provider Model to support different Sanitizer Providers. The July 2011 release of the Ajax Control Toolkit ships with a single Sanitizer Provider which uses the AntiXSS library (see http://AntiXss.CodePlex.com ). A Sanitizer Provider is responsible for sanitizing HTML markup by removing any malicious elements, attributes, and attribute values. For example, the AntiXss Sanitizer Provider will take the following block of HTML: <b><a href=""javascript:doEvil()"">Visit Grandma</a></b> <script>doEvil()</script> And return the following sanitized block of HTML: <b><a href="">Visit Grandma</a></b> Notice that the JavaScript href and <SCRIPT> tag are both stripped out. Be aware that there are a depressingly large number of ways to sneak evil markup into your HTML. You definitely want a Sanitizer as a safety net. Before you can use the AntiXSS Sanitizer Provider, you must add three assemblies to your web application: AntiXSSLibrary.dll, HtmlSanitizationLibrary.dll, and SanitizerProviders.dll. All three assemblies are included with the CodePlex download of the Ajax Control Toolkit in the SanitizerProviders folder. 
Here’s how you modify your web.config file to use the AntiXSS Sanitizer Provider: <configuration> <configSections> <sectionGroup name="system.web"> <section name="sanitizer" requirePermission="false" type="AjaxControlToolkit.Sanitizer.ProviderSanitizerSection, AjaxControlToolkit"/> </sectionGroup> </configSections> <system.web> <compilation targetFramework="4.0" debug="true"/> <sanitizer defaultProvider="AntiXssSanitizerProvider"> <providers> <add name="AntiXssSanitizerProvider" type="AjaxControlToolkit.Sanitizer.AntiXssSanitizerProvider"></add> </providers> </sanitizer> </system.web> </configuration> You can detect whether the HTML Editor Extender is using the AntiXSS Sanitizer Provider by checking the HtmlEditorExtender SanitizerProvider property like this: if (MyHtmlEditorExtender.SanitizerProvider == null) { throw new Exception("Please enable the AntiXss Sanitizer!"); } When the SanitizerProvider property has the value null, you know that a Sanitizer Provider has not been configured in the web.config file. Because the AntiXSS library requires Full Trust, you cannot use the AntiXSS Sanitizer Provider with most shared website hosting providers. Because most shared hosting providers only support Medium Trust and not Full Trust, we do not recommend using the HTML Editor Extender with a public website hosted with a shared hosting provider. Why a New HTML Editor Control? The Ajax Control Toolkit now includes two HTML Editor controls. Why did we introduce a new HTML Editor control when there was already an existing HTML Editor? We think you will like the new HTML Editor much more than the previous one. We had several goals with the new HTML Editor Extender: Lightweight – We wanted to leverage HTML5 to create a lightweight HTML Editor. The new HTML Editor generates much less markup and script than the previous HTML Editor. Secure – We wanted to make it easy to integrate the AntiXSS library with the HTML Editor. If you are creating a public facing website, we strongly recommend that you use the AntiXSS Provider. Customizable – We wanted to make it easy for users to customize the toolbar buttons displayed by the HTML Editor. Compatibility – We wanted to ensure that the HTML Editor will work with the latest versions of the most popular browsers (including Internet Explorer 6 and higher). The old HTML Editor control is still included in the Ajax Control Toolkit and continues to live in the AjaxControlToolkit.HTMLEditor namespace. We have not modified the control and you can continue to use the control in the same way as you have used it in the past. However, we hope that you will consider migrating to the new HTML Editor Extender for the reasons listed above. Summary We’ve introduced a new Ajax Control Toolkit control with this release. I want to thank the developers and testers on the Superexpert team for the huge amount of work which they put into this control. It was a non-trivial task to build an entirely new control which has the complexity of the HTML Editor in less than 6 weeks. Please let us know what you think! We want to hear your feedback. If you discover issues with the new HTML Editor Extender control, or you have questions about the control, or you have ideas for how it can be improved, then please post them to this blog. Tomorrow starts a new sprint
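To make the round trip concrete, here is a minimal code-behind sketch (the button name and save method are assumptions, not from the post) showing that the rich text simply arrives in the TextBox's Text property on postback; with the AntiXSS Sanitizer Provider configured as above, that value should already have passed through the sanitizer:

// Sketch only: txtComments is the TextBox extended by the HtmlEditorExtender;
// btnSave and SaveComment are assumed names used here for illustration.
protected void btnSave_Click(object sender, EventArgs e)
{
    string commentsHtml = txtComments.Text;   // rich-text markup from the editor
    SaveComment(commentsHtml);                // e.g. persist it to a database
}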

    Read the article

  • Sharepoint 2007: author.dll status code?

    - by CrazyNick
Is there a way to find any information from the /_vti_bin/_vti_aut/author.dll status code? The response I get is: <html><head><title>vermeer RPC packet</title></head> <body> <p>method= <p>status= <ul> <li>status=393226 <li>osstatus=0 <li>msg=The form submission cannot be processed because it exceeded the maximum length allowed by the Web administrator. Please resubmit the form with less data. <li>osmsg= </ul> </body> </html>

    Read the article

  • Employee Info Starter Kit - Visual Studio 2010 and .NET 4.0 Version (4.0.0) Available

    - by joycsharp
Employee Info Starter Kit is an ASP.NET based web application with very simple user requirements: create, read, update and delete (CRUD) the employee info of a company. Based on just a database table, it explores and solves all major problems in the web development architectural space. This open source starter kit extensively uses major features available in the latest Visual Studio, ASP.NET and Sql Server to make robust, scalable, secure and maintainable web applications quickly and easily. Since its first release, this starter kit has achieved huge popularity in the web developer community, with 140,000+ downloads from the project web site. Visual Studio 2010 and .NET 4.0 came with lots of exciting features to make software developers' lives easier. A new version (v4.0.0) of Employee Info Starter Kit is now available in both MSDN Code Gallery and CodePlex. Check out the latest version of this starter kit to enjoy the cool features available in Visual Studio 2010 and .NET 4.0. [ Release Notes ]

Architectural Overview: Simple 2 layer architecture (user interface and data access layer) with 1 optional cache layer; ASP.NET Web Form based user interface; Custom Entity Data Container implemented (with primitive C# types for data fields); Active Record Design Pattern based Data Access Layer, implemented in C# and Entity Framework 4.0; Sql Server Stored Procedure to perform actual CRUD operation; Standard infrastructure (architecture, helper utility) for automated integration (bottom up manner) and unit testing.

Technology Utilized - Programming Languages/Scripts: Browser side: JavaScript; Web server side: C# 4.0; Database server side: T-SQL.

.NET Framework Components: .NET 4.0 Entity Framework; .NET 4.0 Optional/Named Parameters; .NET 4.0 Tuple; .NET 3.0+ Extension Method; .NET 3.0+ Lambda Expressions; .NET 3.0+ Anonymous Type; .NET 3.0+ Query Expressions; .NET 3.0+ Automatically Implemented Properties; .NET 3.0+ LINQ; .NET 2.0+ Partial Classes; .NET 2.0+ Generic Type; .NET 2.0+ Nullable Type.

ASP.NET Features: ASP.NET 3.5+ List View (TBD); ASP.NET 3.5+ Data Pager (TBD); ASP.NET 2.0+ Grid View; ASP.NET 2.0+ Form View; ASP.NET 2.0+ Skin; ASP.NET 2.0+ Theme; ASP.NET 2.0+ Master Page; ASP.NET 2.0+ Object Data Source; ASP.NET 1.0+ Role Based Security.

Visual Studio Features: Visual Studio 2010 CodedUI Test; Visual Studio 2010 Layer Diagram; Visual Studio 2010 Sequence Diagram; Visual Studio 2010 Directed Graph; Visual Studio 2005+ Database Unit Test; Visual Studio 2005+ Unit Test; Visual Studio 2005+ Web Test; Visual Studio 2005+ Load Test.

Sql Server Features: Sql Server 2005 Stored Procedure; Sql Server 2005 Xml type; Sql Server 2005 Paging support.
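As a rough illustration of the "Active Record based data access layer calling a stored procedure" idea described above (this is not the starter kit's actual code; the class, method and stored procedure names are assumptions made for this sketch):

// Illustrative only: an Active Record style entity knows how to load and save
// itself, typically by calling a stored procedure through the data layer.
// All names here (Employee, Save, GetById) are assumptions, not taken from the kit.
public class Employee
{
    public int EmployeeId { get; set; }
    public string Name { get; set; }
    public decimal Salary { get; set; }

    // Persist this record, e.g. by executing an insert/update stored procedure
    // through an Entity Framework 4.0 context.
    public void Save()
    {
        // data-access call goes here in a real implementation
    }

    // Load a single record by primary key, e.g. via a "get by id" stored procedure.
    public static Employee GetById(int employeeId)
    {
        // data-access call goes here in a real implementation
        return new Employee { EmployeeId = employeeId };
    }
}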

    Read the article

  • Agile Development

    - by James Oloo Onyango
    A lot of literature has been and is being written about agile development and its surrounding philosophies. In my quest to find the best way to express the importance of agile methodologies, I have found Robert C. Martin's "A Satire Of Two Companies" to be both the most concise and thorough! Enjoy the read! Rufus Inc Project Kick Off Your name is Bob. The date is January 3, 2001, and your head still aches from the recent millennial revelry. You are sitting in a conference room with several managers and a group of your peers. You are a project team leader. Your boss is there, and he has brought along all of his team leaders. His boss called the meeting. "We have a new project to develop," says your boss's boss. Call him BB. The points in his hair are so long that they scrape the ceiling. Your boss's points are just starting to grow, but he eagerly awaits the day when he can leave Brylcream stains on the acoustic tiles. BB describes the essence of the new market they have identified and the product they want to develop to exploit this market. "We must have this new project up and working by fourth quarter, October 1," BB demands. "Nothing is of higher priority, so we are cancelling your current project." The reaction in the room is stunned silence. Months of work are simply going to be thrown away. Slowly, a murmur of objection begins to circulate around the conference table. His points give off an evil green glow as BB meets the eyes of everyone in the room. One by one, that insidious stare reduces each attendee to quivering lumps of protoplasm. It is clear that he will brook no discussion on this matter. Once silence has been restored, BB says, "We need to begin immediately. How long will it take you to do the analysis?" You raise your hand. Your boss tries to stop you, but his spitwad misses you and you are unaware of his efforts. "Sir, we can't tell you how long the analysis will take until we have some requirements." "The requirements document won't be ready for 3 or 4 weeks," BB says, his points vibrating with frustration. "So, pretend that you have the requirements in front of you now. How long will you require for analysis?" No one breathes. Everyone looks around to see whether anyone has some idea. "If analysis goes beyond April 1, we have a problem. Can you finish the analysis by then?" Your boss visibly gathers his courage: "We'll find a way, sir!" His points grow 3 mm, and your headache increases by two Tylenol. "Good." BB smiles. "Now, how long will it take to do the design?" "Sir," you say. Your boss visibly pales. He is clearly worried that his 3 mms are at risk. "Without an analysis, it will not be possible to tell you how long design will take." BB's expression shifts beyond austere. "PRETEND you have the analysis already!" he says, while fixing you with his vacant, beady little eyes. "How long will it take you to do the design?" Two Tylenol are not going to cut it. Your boss, in a desperate attempt to save his new growth, babbles: "Well, sir, with only six months left to complete the project, design had better take no longer than 3 months." "I'm glad you agree, Smithers!" BB says, beaming. Your boss relaxes. He knows his points are secure. After a while, he starts lightly humming the Brylcream jingle. BB continues, "So, analysis will be complete by April 1, design will be complete by July 1, and that gives you 3 months to implement the project. This meeting is an example of how well our new consensus and empowerment policies are working. 
Now, get out there and start working. I'll expect to see TQM plans and QIT assignments on my desk by next week. Oh, and don't forget that your crossfunctional team meetings and reports will be needed for next month's quality audit." "Forget the Tylenol," you think to yourself as you return to your cubicle. "I need bourbon."   Visibly excited, your boss comes over to you and says, "Gosh, what a great meeting. I think we're really going to do some world shaking with this project." You nod in agreement, too disgusted to do anything else. "Oh," your boss continues, "I almost forgot." He hands you a 30-page document. "Remember that the SEI is coming to do an evaluation next week. This is the evaluation guide. You need to read through it, memorize it, and then shred it. It tells you how to answer any questions that the SEI auditors ask you. It also tells you what parts of the building you are allowed to take them to and what parts to avoid. We are determined to be a CMM level 3 organization by June!"   You and your peers start working on the analysis of the new project. This is difficult because you have no requirements. But from the 10-minute introduction given by BB on that fateful morning, you have some idea of what the product is supposed to do.   Corporate process demands that you begin by creating a use case document. You and your team begin enumerating use cases and drawing oval and stick diagrams. Philosophical debates break out among the team members. There is disagreement as to whether certain use cases should be connected with <<extends>> or <<includes>> relationships. Competing models are created, but nobody knows how to evaluate them. The debate continues, effectively paralyzing progress.   After a week, somebody finds the iceberg.com Web site, which recommends disposing entirely of <<extends>> and <<includes>> and replacing them with <<precedes>> and <<uses>>. The documents on this Web site, authored by Don Sengroiux, describes a method known as stalwart-analysis, which claims to be a step-by-step method for translating use cases into design diagrams. More competing use case models are created using this new scheme, but again, people can't agree on how to evaluate them. The thrashing continues. More and more, the use case meetings are driven by emotion rather than by reason. If it weren't for the fact that you don't have requirements, you'd be pretty upset by the lack of progress you are making. The requirements document arrives on February 15. And then again on February 20, 25, and every week thereafter. Each new version contradicts the previous one. Clearly, the marketing folks who are writing the requirements, empowered though they might be, are not finding consensus.   At the same time, several new competing use case templates have been proposed by the various team members. Each template presents its own particularly creative way of delaying progress. The debates rage on. On March 1, Prudence Putrigence, the process proctor, succeeds in integrating all the competing use case forms and templates into a single, all-encompassing form. Just the blank form is 15 pages long. She has managed to include every field that appeared on all the competing templates. She also presents a 159- page document describing how to fill out the use case form. All current use cases must be rewritten according to the new standard.   You marvel to yourself that it now requires 15 pages of fill-in-the-blank and essay questions to answer the question: What should the system do when the user presses Return? 
The corporate process (authored by L. E. Ott, famed author of "Holistic Analysis: A Progressive Dialectic for Software Engineers") insists that you discover all primary use cases, 87 percent of all secondary use cases, and 36.274 percent of all tertiary use cases before you can complete analysis and enter the design phase. You have no idea what a tertiary use case is. So in an attempt to meet this requirement, you try to get your use case document reviewed by the marketing department, which you hope will know what a tertiary use case is.   Unfortunately, the marketing folks are too busy with sales support to talk to you. Indeed, since the project started, you have not been able to get a single meeting with marketing, which has provided a never-ending stream of changing and contradictory requirements documents.   While one team has been spinning endlessly on the use case document, another team has been working out the domain model. Endless variations of UML documents are pouring out of this team. Every week, the model is reworked.   The team members can't decide whether to use <<interfaces>> or <<types>> in the model. A huge disagreement has been raging on the proper syntax and application of OCL. Others on the team just got back from a 5-day class on catabolism, and have been producing incredibly detailed and arcane diagrams that nobody else can fathom.   On March 27, with one week to go before analysis is to be complete, you have produced a sea of documents and diagrams but are no closer to a cogent analysis of the problem than you were on January 3. **** And then, a miracle happens.   **** On Saturday, April 1, you check your e-mail from home. You see a memo from your boss to BB. It states unequivocally that you are done with the analysis! You phone your boss and complain. "How could you have told BB that we were done with the analysis?" "Have you looked at a calendar lately?" he responds. "It's April 1!" The irony of that date does not escape you. "But we have so much more to think about. So much more to analyze! We haven't even decided whether to use <<extends>> or <<precedes>>!" "Where is your evidence that you are not done?" inquires your boss, impatiently. "Whaaa . . . ." But he cuts you off. "Analysis can go on forever; it has to be stopped at some point. And since this is the date it was scheduled to stop, it has been stopped. Now, on Monday, I want you to gather up all existing analysis materials and put them into a public folder. Release that folder to Prudence so that she can log it in the CM system by Monday afternoon. Then get busy and start designing."   As you hang up the phone, you begin to consider the benefits of keeping a bottle of bourbon in your bottom desk drawer. They threw a party to celebrate the on-time completion of the analysis phase. BB gave a colon-stirring speech on empowerment. And your boss, another 3 mm taller, congratulated his team on the incredible show of unity and teamwork. Finally, the CIO takes the stage to tell everyone that the SEI audit went very well and to thank everyone for studying and shredding the evaluation guides that were passed out. Level 3 now seems assured and will be awarded by June. (Scuttlebutt has it that managers at the level of BB and above are to receive significant bonuses once the SEI awards level 3.)   As the weeks flow by, you and your team work on the design of the system. Of course, you find that the analysis that the design is supposedly based on is flawedno, useless; no, worse than useless. 
    But when you tell your boss that you need to go back and work some more on the analysis to shore up its weaker sections, he simply states, "The analysis phase is over. The only allowable activity is design. Now get back to it." So, you and your team hack the design as best you can, unsure of whether the requirements have been properly analyzed. Of course, it really doesn't matter much, since the requirements document is still thrashing with weekly revisions, and the marketing department still refuses to meet with you. The design is a nightmare. Your boss recently misread a book named The Finish Line in which the author, Mark DeThomaso, blithely suggested that design documents should be taken down to code-level detail. "If we are going to be working at that level of detail," you ask, "why don't we simply write the code instead?" "Because then you wouldn't be designing, of course. And the only allowable activity in the design phase is design!" "Besides," he continues, "we have just purchased a companywide license for Dandelion! This tool enables 'Round the Horn Engineering!' You are to transfer all design diagrams into this tool. It will automatically generate our code for us! It will also keep the design diagrams in sync with the code!" Your boss hands you a brightly colored shrinkwrapped box containing the Dandelion distribution. You accept it numbly and shuffle off to your cubicle. Twelve hours, eight crashes, one disk reformatting, and eight shots of 151 later, you finally have the tool installed on your server. You consider the week your team will lose while attending Dandelion training. Then you smile and think, "Any week I'm not here is a good week." Design diagram after design diagram is created by your team. Dandelion makes it very difficult to draw these diagrams. There are dozens and dozens of deeply nested dialog boxes with funny text fields and check boxes that must all be filled in correctly. And then there's the problem of moving classes between packages. At first, these diagrams are driven from the use cases. But the requirements are changing so often that the use cases rapidly become meaningless. Debates rage about whether VISITOR or DECORATOR design patterns should be used. One developer refuses to use VISITOR in any form, claiming that it's not a properly object-oriented construct. Someone refuses to use multiple inheritance, since it is the spawn of the devil. Review meetings rapidly degenerate into debates about the meaning of object orientation, the definition of analysis versus design, or when to use aggregation versus association. Midway through the design cycle, the marketing folks announce that they have rethought the focus of the system. Their new requirements document is completely restructured. They have eliminated several major feature areas and replaced them with feature areas that they anticipate customer surveys will show to be more appropriate. You tell your boss that these changes mean that you need to reanalyze and redesign much of the system. But he says, "The analysis phase is over. The only allowable activity is design. Now get back to it." You suggest that it might be better to create a simple prototype to show to the marketing folks and even some potential customers. But your boss says, "The analysis phase is over. The only allowable activity is design. Now get back to it." Hack, hack, hack, hack. You try to create some kind of a design document that might reflect the new requirements documents. 
However, the revolution of the requirements has not caused them to stop thrashing. Indeed, if anything, the wild oscillations of the requirements document have only increased in frequency and amplitude.   You slog your way through them.   On June 15, the Dandelion database gets corrupted. Apparently, the corruption has been progressive. Small errors in the DB accumulated over the months into bigger and bigger errors. Eventually, the CASE tool just stopped working. Of course, the slowly encroaching corruption is present on all the backups. Calls to the Dandelion technical support line go unanswered for several days. Finally, you receive a brief e-mail from Dandelion, informing you that this is a known problem and that the solution is to purchase the new version, which they promise will be ready some time next quarter, and then reenter all the diagrams by hand.   ****   Then, on July 1 another miracle happens! You are done with the design!   Rather than go to your boss and complain, you stock your middle desk drawer with some vodka.   **** They threw a party to celebrate the on-time completion of the design phase and their graduation to CMM level 3. This time, you find BB's speech so stirring that you have to use the restroom before it begins. New banners and plaques are all over your workplace. They show pictures of eagles and mountain climbers, and they talk about teamwork and empowerment. They read better after a few scotches. That reminds you that you need to clear out your file cabinet to make room for the brandy. You and your team begin to code. But you rapidly discover that the design is lacking in some significant areas. Actually, it's lacking any significance at all. You convene a design session in one of the conference rooms to try to work through some of the nastier problems. But your boss catches you at it and disbands the meeting, saying, "The design phase is over. The only allowable activity is coding. Now get back to it."   ****   The code generated by Dandelion is really hideous. It turns out that you and your team were using association and aggregation the wrong way, after all. All the generated code has to be edited to correct these flaws. Editing this code is extremely difficult because it has been instrumented with ugly comment blocks that have special syntax that Dandelion needs in order to keep the diagrams in sync with the code. If you accidentally alter one of these comments, the diagrams will be regenerated incorrectly. It turns out that "Round the Horn Engineering" requires an awful lot of effort. The more you try to keep the code compatible with Dandelion, the more errors Dandelion generates. In the end, you give up and decide to keep the diagrams up to date manually. A second later, you decide that there's no point in keeping the diagrams up to date at all. Besides, who has time?   Your boss hires a consultant to build tools to count the number of lines of code that are being produced. He puts a big thermometer graph on the wall with the number 1,000,000 on the top. Every day, he extends the red line to show how many lines have been added. Three days after the thermometer appears on the wall, your boss stops you in the hall. "That graph isn't growing quickly enough. We need to have a million lines done by October 1." "We aren't even sh-sh-sure that the proshect will require a m-million linezh," you blather. "We have to have a million lines done by October 1," your boss reiterates. 
His points have grown again, and the Grecian formula he uses on them creates an aura of authority and competence. "Are you sure your comment blocks are big enough?" Then, in a flash of managerial insight, he says, "I have it! I want you to institute a new policy among the engineers. No line of code is to be longer than 20 characters. Any such line must be split into two or more preferably more. All existing code needs to be reworked to this standard. That'll get our line count up!"   You decide not to tell him that this will require two unscheduled work months. You decide not to tell him anything at all. You decide that intravenous injections of pure ethanol are the only solution. You make the appropriate arrangements. Hack, hack, hack, and hack. You and your team madly code away. By August 1, your boss, frowning at the thermometer on the wall, institutes a mandatory 50-hour workweek.   Hack, hack, hack, and hack. By September 1st, the thermometer is at 1.2 million lines and your boss asks you to write a report describing why you exceeded the coding budget by 20 percent. He institutes mandatory Saturdays and demands that the project be brought back down to a million lines. You start a campaign of remerging lines. Hack, hack, hack, and hack. Tempers are flaring; people are quitting; QA is raining trouble reports down on you. Customers are demanding installation and user manuals; salespeople are demanding advance demonstrations for special customers; the requirements document is still thrashing, the marketing folks are complaining that the product isn't anything like they specified, and the liquor store won't accept your credit card anymore. Something has to give.    On September 15, BB calls a meeting. As he enters the room, his points are emitting clouds of steam. When he speaks, the bass overtones of his carefully manicured voice cause the pit of your stomach to roll over. "The QA manager has told me that this project has less than 50 percent of the required features implemented. He has also informed me that the system crashes all the time, yields wrong results, and is hideously slow. He has also complained that he cannot keep up with the continuous train of daily releases, each more buggy than the last!" He stops for a few seconds, visibly trying to compose himself. "The QA manager estimates that, at this rate of development, we won't be able to ship the product until December!" Actually, you think it's more like March, but you don't say anything. "December!" BB roars with such derision that people duck their heads as though he were pointing an assault rifle at them. "December is absolutely out of the question. Team leaders, I want new estimates on my desk in the morning. I am hereby mandating 65-hour work weeks until this project is complete. And it better be complete by November 1."   As he leaves the conference room, he is heard to mutter: "Empowermentbah!" * * * Your boss is bald; his points are mounted on BB's wall. The fluorescent lights reflecting off his pate momentarily dazzle you. "Do you have anything to drink?" he asks. Having just finished your last bottle of Boone's Farm, you pull a bottle of Thunderbird from your bookshelf and pour it into his coffee mug. "What's it going to take to get this project done? " he asks. "We need to freeze the requirements, analyze them, design them, and then implement them," you say callously. "By November 1?" your boss exclaims incredulously. "No way! Just get back to coding the damned thing." He storms out, scratching his vacant head.   
A few days later, you find that your boss has been transferred to the corporate research division. Turnover has skyrocketed. Customers, informed at the last minute that their orders cannot be fulfilled on time, have begun to cancel their orders. Marketing is re-evaluating whether this product aligns with the overall goals of the company. Memos fly, heads roll, policies change, and things are, overall, pretty grim. Finally, by March, after far too many sixty-five hour weeks, a very shaky version of the software is ready. In the field, bug-discovery rates are high, and the technical support staff are at their wits' end, trying to cope with the complaints and demands of the irate customers. Nobody is happy.   In April, BB decides to buy his way out of the problem by licensing a product produced by Rupert Industries and redistributing it. The customers are mollified, the marketing folks are smug, and you are laid off.     Rupert Industries: Project Alpha   Your name is Robert. The date is January 3, 2001. The quiet hours spent with your family this holiday have left you refreshed and ready for work. You are sitting in a conference room with your team of professionals. The manager of the division called the meeting. "We have some ideas for a new project," says the division manager. Call him Russ. He is a high-strung British chap with more energy than a fusion reactor. He is ambitious and driven but understands the value of a team. Russ describes the essence of the new market opportunity the company has identified and introduces you to Jane, the marketing manager, who is responsible for defining the products that will address it. Addressing you, Jane says, "We'd like to start defining our first product offering as soon as possible. When can you and your team meet with me?" You reply, "We'll be done with the current iteration of our project this Friday. We can spare a few hours for you between now and then. After that, we'll take a few people from the team and dedicate them to you. We'll begin hiring their replacements and the new people for your team immediately." "Great," says Russ, "but I want you to understand that it is critical that we have something to exhibit at the trade show coming up this July. If we can't be there with something significant, we'll lose the opportunity."   "I understand," you reply. "I don't yet know what it is that you have in mind, but I'm sure we can have something by July. I just can't tell you what that something will be right now. In any case, you and Jane are going to have complete control over what we developers do, so you can rest assured that by July, you'll have the most important things that can be accomplished in that time ready to exhibit."   Russ nods in satisfaction. He knows how this works. Your team has always kept him advised and allowed him to steer their development. He has the utmost confidence that your team will work on the most important things first and will produce a high-quality product.   * * *   "So, Robert," says Jane at their first meeting, "How does your team feel about being split up?" "We'll miss working with each other," you answer, "but some of us were getting pretty tired of that last project and are looking forward to a change. So, what are you people cooking up?" Jane beams. "You know how much trouble our customers currently have . . ." And she spends a half hour or so describing the problem and possible solution. "OK, wait a second" you respond. "I need to be clear about this." 
And so you and Jane talk about how this system might work. Some of her ideas aren't fully formed. You suggest possible solutions. She likes some of them. You continue discussing.   During the discussion, as each new topic is addressed, Jane writes user story cards. Each card represents something that the new system has to do. The cards accumulate on the table and are spread out in front of you. Both you and Jane point at them, pick them up, and make notes on them as you discuss the stories. The cards are powerful mnemonic devices that you can use to represent complex ideas that are barely formed.   At the end of the meeting, you say, "OK, I've got a general idea of what you want. I'm going to talk to the team about it. I imagine they'll want to run some experiments with various database structures and presentation formats. Next time we meet, it'll be as a group, and we'll start identifying the most important features of the system."   A week later, your nascent team meets with Jane. They spread the existing user story cards out on the table and begin to get into some of the details of the system. The meeting is very dynamic. Jane presents the stories in the order of their importance. There is much discussion about each one. The developers are concerned about keeping the stories small enough to estimate and test. So they continually ask Jane to split one story into several smaller stories. Jane is concerned that each story have a clear business value and priority, so as she splits them, she makes sure that this stays true.   The stories accumulate on the table. Jane writes them, but the developers make notes on them as needed. Nobody tries to capture everything that is said; the cards are not meant to capture everything but are simply reminders of the conversation.   As the developers become more comfortable with the stories, they begin writing estimates on them. These estimates are crude and budgetary, but they give Jane an idea of what the story will cost.   At the end of the meeting, it is clear that many more stories could be discussed. It is also clear that the most important stories have been addressed and that they represent several months worth of work. Jane closes the meeting by taking the cards with her and promising to have a proposal for the first release in the morning.   * * *   The next morning, you reconvene the meeting. Jane chooses five cards and places them on the table. "According to your estimates, these cards represent about one perfect team-week's worth of work. The last iteration of the previous project managed to get one perfect team-week done in 3 real weeks. If we can get these five stories done in 3 weeks, we'll be able to demonstrate them to Russ. That will make him feel very comfortable about our progress." Jane is pushing it. The sheepish look on her face lets you know that she knows it too. You reply, "Jane, this is a new team, working on a new project. It's a bit presumptuous to expect that our velocity will be the same as the previous team's. However, I met with the team yesterday afternoon, and we all agreed that our initial velocity should, in fact, be set to one perfectweek for every 3 real-weeks. So you've lucked out on this one." "Just remember," you continue, "that the story estimates and the story velocity are very tentative at this point. We'll learn more when we plan the iteration and even more when we implement it."   Jane looks over her glasses at you as if to say "Who's the boss around here, anyway?" and then smiles and says, "Yeah, don't worry. 
I know the drill by now."Jane then puts 15 more cards on the table. She says, "If we can get all these cards done by the end of March, we can turn the system over to our beta test customers. And we'll get good feedback from them."   You reply, "OK, so we've got our first iteration defined, and we have the stories for the next three iterations after that. These four iterations will make our first release."   "So," says Jane, can you really do these five stories in the next 3 weeks?" "I don't know for sure, Jane," you reply. "Let's break them down into tasks and see what we get."   So Jane, you, and your team spend the next several hours taking each of the five stories that Jane chose for the first iteration and breaking them down into small tasks. The developers quickly realize that some of the tasks can be shared between stories and that other tasks have commonalities that can probably be taken advantage of. It is clear that potential designs are popping into the developers' heads. From time to time, they form little discussion knots and scribble UML diagrams on some cards.   Soon, the whiteboard is filled with the tasks that, once completed, will implement the five stories for this iteration. You start the sign-up process by saying, "OK, let's sign up for these tasks." "I'll take the initial database generation." Says Pete. "That's what I did on the last project, and this doesn't look very different. I estimate it at two of my perfect workdays." "OK, well, then, I'll take the login screen," says Joe. "Aw, darn," says Elaine, the junior member of the team, "I've never done a GUI, and kinda wanted to try that one."   "Ah, the impatience of youth," Joe says sagely, with a wink in your direction. "You can assist me with it, young Jedi." To Jane: "I think it'll take me about three of my perfect workdays."   One by one, the developers sign up for tasks and estimate them in terms of their own perfect workdays. Both you and Jane know that it is best to let the developers volunteer for tasks than to assign the tasks to them. You also know full well that you daren't challenge any of the developers' estimates. You know these people, and you trust them. You know that they are going to do the very best they can.   The developers know that they can't sign up for more perfect workdays than they finished in the last iteration they worked on. Once each developer has filled his or her schedule for the iteration, they stop signing up for tasks.   Eventually, all the developers have stopped signing up for tasks. But, of course, tasks are still left on the board.   "I was worried that that might happen," you say, "OK, there's only one thing to do, Jane. We've got too much to do in this iteration. What stories or tasks can we remove?" Jane sighs. She knows that this is the only option. Working overtime at the beginning of a project is insane, and projects where she's tried it have not fared well.   So Jane starts to remove the least-important functionality. "Well, we really don't need the login screen just yet. We can simply start the system in the logged-in state." "Rats!" cries Elaine. "I really wanted to do that." "Patience, grasshopper." says Joe. "Those who wait for the bees to leave the hive will not have lips too swollen to relish the honey." Elaine looks confused. Everyone looks confused. "So . . .," Jane continues, "I think we can also do away with . . ." And so, bit by bit, the list of tasks shrinks. Developers who lose a task sign up for one of the remaining ones.   The negotiation is not painless. 
Several times, Jane exhibits obvious frustration and impatience. Once, when tensions are especially high, Elaine volunteers, "I'll work extra hard to make up some of the missing time." You are about to correct her when, fortunately, Joe looks her in the eye and says, "When once you proceed down the dark path, forever will it dominate your destiny."   In the end, an iteration acceptable to Jane is reached. It's not what Jane wanted. Indeed, it is significantly less. But it's something the team feels that can be achieved in the next 3 weeks.   And, after all, it still addresses the most important things that Jane wanted in the iteration. "So, Jane," you say when things had quieted down a bit, "when can we expect acceptance tests from you?" Jane sighs. This is the other side of the coin. For every story the development team implements,   Jane must supply a suite of acceptance tests that prove that it works. And the team needs these long before the end of the iteration, since they will certainly point out differences in the way Jane and the developers imagine the system's behaviour.   "I'll get you some example test scripts today," Jane promises. "I'll add to them every day after that. You'll have the entire suite by the middle of the iteration."   * * *   The iteration begins on Monday morning with a flurry of Class, Responsibilities, Collaborators sessions. By midmorning, all the developers have assembled into pairs and are rapidly coding away. "And now, my young apprentice," Joe says to Elaine, "you shall learn the mysteries of test-first design!"   "Wow, that sounds pretty rad," Elaine replies. "How do you do it?" Joe beams. It's clear that he has been anticipating this moment. "OK, what does the code do right now?" "Huh?" replied Elaine, "It doesn't do anything at all; there is no code."   "So, consider our task; can you think of something the code should do?" "Sure," Elaine said with youthful assurance, "First, it should connect to the database." "And thereupon, what must needs be required to connecteth the database?" "You sure talk weird," laughed Elaine. "I think we'd have to get the database object from some registry and call the Connect() method. "Ah, astute young wizard. Thou perceives correctly that we requireth an object within which we can cacheth the database object." "Is 'cacheth' really a word?" "It is when I say it! So, what test can we write that we know the database registry should pass?" Elaine sighs. She knows she'll just have to play along. "We should be able to create a database object and pass it to the registry in a Store() method. And then we should be able to pull it out of the registry with a Get() method and make sure it's the same object." "Oh, well said, my prepubescent sprite!" "Hay!" "So, now, let's write a test function that proves your case." "But shouldn't we write the database object and registry object first?" "Ah, you've much to learn, my young impatient one. Just write the test first." "But it won't even compile!" "Are you sure? What if it did?" "Uh . . ." "Just write the test, Elaine. Trust me." And so Joe, Elaine, and all the other developers began to code their tasks, one test case at a time. The room in which they worked was abuzz with the conversations between the pairs. The murmur was punctuated by an occasional high five when a pair managed to finish a task or a difficult test case.   As development proceeded, the developers changed partners once or twice a day. 
Each developer got to see what all the others were doing, and so knowledge of the code spread generally throughout the team.   Whenever a pair finished something significant whether a whole task or simply an important part of a task they integrated what they had with the rest of the system. Thus, the code base grew daily, and integration difficulties were minimized.   The developers communicated with Jane on a daily basis. They'd go to her whenever they had a question about the functionality of the system or the interpretation of an acceptance test case.   Jane, good as her word, supplied the team with a steady stream of acceptance test scripts. The team read these carefully and thereby gained a much better understanding of what Jane expected the system to do. By the beginning of the second week, there was enough functionality to demonstrate to Jane. She watched eagerly as the demonstration passed test case after test case. "This is really cool," Jane said as the demonstration finally ended. "But this doesn't seem like one-third of the tasks. Is your velocity slower than anticipated?"   You grimace. You'd been waiting for a good time to mention this to Jane but now she was forcing the issue. "Yes, unfortunately, we are going more slowly than we had expected. The new application server we are using is turning out to be a pain to configure. Also, it takes forever to reboot, and we have to reboot it whenever we make even the slightest change to its configuration."   Jane eyes you with suspicion. The stress of last Monday's negotiations had still not entirely dissipated. She says, "And what does this mean to our schedule? We can't slip it again, we just can't. Russ will have a fit! He'll haul us all into the woodshed and ream us some new ones."   You look Jane right in the eyes. There's no pleasant way to give someone news like this. So you just blurt out, "Look, if things keep going like they're going, we're not going to be done with everything by next Friday. Now it's possible that we'll figure out a way to go faster. But, frankly, I wouldn't depend on that. You should start thinking about one or two tasks that could be removed from the iteration without ruining the demonstration for Russ. Come hell or high water, we are going to give that demonstration on Friday, and I don't think you want us to choose which tasks to omit."   "Aw forchrisakes!" Jane barely manages to stifle yelling that last word as she stalks away, shaking her head. Not for the first time, you say to yourself, "Nobody ever promised me project management would be easy." You are pretty sure it won't be the last time, either.   Actually, things went a bit better than you had hoped. The team did, in fact, have to drop one task from the iteration, but Jane had chosen wisely, and the demonstration for Russ went without a hitch. Russ was not impressed with the progress, but neither was he dismayed. He simply said, "This is pretty good. But remember, we have to be able to demonstrate this system at the trade show in July, and at this rate, it doesn't look like you'll have all that much to show." Jane, whose attitude had improved dramatically with the completion of the iteration, responded to Russ by saying, "Russ, this team is working hard, and well. When July comes around, I am confident that we'll have something significant to demonstrate. It won't be everything, and some of it may be smoke and mirrors, but we'll have something."   Painful though the last iteration was, it had calibrated your velocity numbers. 
The next iteration went much better. Not because your team got more done than in the last iteration but simply because the team didn't have to remove any tasks or stories in the middle of the iteration.   By the start of the fourth iteration, a natural rhythm has been established. Jane, you, and the team know exactly what to expect from one another. The team is running hard, but the pace is sustainable. You are confident that the team can keep up this pace for a year or more.   The number of surprises in the schedule diminishes to near zero; however, the number of surprises in the requirements does not. Jane and Russ frequently look over the growing system and make recommendations or changes to the existing functionality. But all parties realize that these changes take time and must be scheduled. So the changes do not cause anyone's expectations to be violated. In March, there is a major demonstration of the system to the board of directors. The system is very limited and is not yet in a form good enough to take to the trade show, but progress is steady, and the board is reasonably impressed.   The second release goes even more smoothly than the first. By now, the team has figured out a way to automate Jane's acceptance test scripts. The team has also refactored the design of the system to the point that it is really easy to add new features and change old ones. The second release was done by the end of June and was taken to the trade show. It had less in it than Jane and Russ would have liked, but it did demonstrate the most important features of the system. Although customers at the trade show noticed that certain features were missing, they were very impressed overall. You, Russ, and Jane all returned from the trade show with smiles on your faces. You all felt as though this project was a winner.   Indeed, many months later, you are contacted by Rufus Inc. That company had been working on a system like this for its internal operations. Rufus has canceled the development of that system after a death-march project and is negotiating to license your technology for its environment.   Indeed, things are looking up!

    Read the article

< Previous Page | 363 364 365 366 367 368 369 370 371 372 373 374  | Next Page >