Search Results

Search found 4130 results on 166 pages for 'david grant'.


  • Paging Through Records (JSON data) Using jQuery...

    - by Pandiya Chendur
    I have a JSON result that contains numerous records. I'd like to show the first five records in one page and create pager links which have to move to that page with five record so on. I don't want the page to refresh which is why I'm hoping a combination of JavaScript and jQuery... My json Data looks like this... {"Table" : [{"Emp_Id" : "3","Identity_No" : "","Emp_Name" : "Jerome","Address" : "Madurai","Date_Of_Birth" : "","Desig_Name" : "Supervisior","Desig_Description" : "Supervisior of the Construction","SalaryBasis" : "Monthly","FixedSalary" : "25000.00"},{"Emp_Id" : "4","Identity_No" : "","Emp_Name" : "Mohan","Address" : "Madurai","Date_Of_Birth" : "","Desig_Name" : "Acc ","Desig_Description" : "Accountant","SalaryBasis" : "Monthly","FixedSalary" : "200.00"},{"Emp_Id" : "5","Identity_No" : "","Emp_Name" : "Murugan","Address" : "Madurai","Date_Of_Birth" : "","Desig_Name" : "Mason","Desig_Description" : "Mason","SalaryBasis" : "Weekly","FixedSalary" : "150.00"},{"Emp_Id" : "6","Identity_No" : "","Emp_Name" : "Ram","Address" : "Madurai","Date_Of_Birth" : "","Desig_Name" : "Mason","Desig_Description" : "Mason","SalaryBasis" : "Weekly","FixedSalary" : "120.00"},{"Emp_Id" : "7","Identity_No" : "","Emp_Name" : "Raja","Address" : "Madurai","Date_Of_Birth" : "","Desig_Name" : "Mason","Desig_Description" : "Mason","SalaryBasis" : "Weekly","FixedSalary" : "135.00"},{"Emp_Id" : "8","Identity_No" : "","Emp_Name" : "Raja kumar","Address" : "Madurai","Date_Of_Birth" : "","Desig_Name" : "Mason Helper","Desig_Description" : "Mason Helper","SalaryBasis" : "Weekly","FixedSalary" : "105.00"},{"Emp_Id" : "9","Identity_No" : "","Emp_Name" : "Lakshmi","Address" : "Madurai","Date_Of_Birth" : "","Desig_Name" : "Mason Helper","Desig_Description" : "Mason Helper","SalaryBasis" : "Weekly","FixedSalary" : "100.00"},{"Emp_Id" : "10","Identity_No" : "","Emp_Name" : "Palani","Address" : "Madurai","Date_Of_Birth" : "","Desig_Name" : "Carpenter","Desig_Description" : "Carpenter","SalaryBasis" : "Weekly","FixedSalary" : "200.00"},{"Emp_Id" : "11","Identity_No" : "","Emp_Name" : "Annamalai","Address" : "Madurai","Date_Of_Birth" : "","Desig_Name" : "Carpenter","Desig_Description" : "Carpenter","SalaryBasis" : "Weekly","FixedSalary" : "220.00"},{"Emp_Id" : "12","Identity_No" : "","Emp_Name" : "David","Address" : "Madurai","Date_Of_Birth" : "","Desig_Name" : "Steel Fixer","Desig_Description" : "Steel Fixer","SalaryBasis" : "Weekly","FixedSalary" : "220.00"},{"Emp_Id" : "13","Identity_No" : "","Emp_Name" : "Chandru","Address" : "Madurai","Date_Of_Birth" : "","Desig_Name" : "Steel Fixer","Desig_Description" : "Steel Fixer","SalaryBasis" : "Weekly","FixedSalary" : "220.00"},{"Emp_Id" : "14","Identity_No" : "","Emp_Name" : "Mani","Address" : "Madurai","Date_Of_Birth" : "","Desig_Name" : "Steel Helper","Desig_Description" : "Steel Helper","SalaryBasis" : "Weekly","FixedSalary" : "175.00"},{"Emp_Id" : "15","Identity_No" : "","Emp_Name" : "Karthik","Address" : "Madurai","Date_Of_Birth" : "","Desig_Name" : "Wood Fixer","Desig_Description" : "Wood Fixer","SalaryBasis" : "Weekly","FixedSalary" : "195.00"},{"Emp_Id" : "16","Identity_No" : "","Emp_Name" : "Bala","Address" : "Madurai","Date_Of_Birth" : "","Desig_Name" : "Wood Fixer","Desig_Description" : "Wood Fixer","SalaryBasis" : "Weekly","FixedSalary" : "185.00"},{"Emp_Id" : "17","Identity_No" : "","Emp_Name" : "Tamil arasi","Address" : "Madurai","Date_Of_Birth" : "","Desig_Name" : "Wood Helper","Desig_Description" : "Wood Helper","SalaryBasis" : "Weekly","FixedSalary" : 
"185.00"},{"Emp_Id" : "18","Identity_No" : "","Emp_Name" : "Perumal","Address" : "Madurai","Date_Of_Birth" : "","Desig_Name" : "Cook","Desig_Description" : "Cook","SalaryBasis" : "Weekly","FixedSalary" : "105.00"},{"Emp_Id" : "19","Identity_No" : "","Emp_Name" : "Andiappan","Address" : "Madurai","Date_Of_Birth" : "","Desig_Name" : "Watchman","Desig_Description" : "Watchman","SalaryBasis" : "Weekly","FixedSalary" : "150.00"}]} And as of now my result looks like this, http://img401.imageshack.us/img401/2500/yuidtsum.jpg I have used jquery for this, var jsonObj = JSON.parse(HfJsonValue); for (var i = jsonObj.Table.length - 1; i >= 0; i--) { var employee = jsonObj.Table[i]; $('<div class="resultsdiv"><br /><span class="resultName">' + employee.Emp_Name + '</span><span class="resultfields" style="padding-left:100px;">Category&nbsp;:</span>&nbsp;<span class="resultfieldvalues">' + employee.Desig_Name + '</span><br /><br /><span id="SalaryBasis" class="resultfields">Salary Basis&nbsp;:</span>&nbsp;<span class="resultfieldvalues">' + employee.SalaryBasis + '</span><span class="resultfields" style="padding-left:25px;">Salary&nbsp;:</span>&nbsp;<span class="resultfieldvalues">' + employee.FixedSalary + '</span><span style="font-size:110%;font-weight:bolder;padding-left:25px;">Address&nbsp;:</span>&nbsp;<span class="resultfieldvalues">' + employee.Address + '</span></div>').insertAfter('#ResultsDiv'); } My image contains only 6 records as of now.. Any suggestions?

    Read the article

  • Good jQuery pagination plugin to use with JSON data...

    - by bala3569
    I am looking for a good jquery pagination plugin to use in my aspx page.... I have the following parameters currentpage,pagesize,TotalRecords,NumberofPages... I would like my plugin to same as stackoverflow paging .... EDIT: It should paginate through json data.... similar to this I use my json data and iterating with jquery var jsonObj = jQuery.parseJSON(HfJsonValue); for (var i = jsonObj.Table.length - 1; i >= 0; i--) { var employee = jsonObj.Table[i]; $('<div class="resultsdiv"><br /><span class="resultName">' + employee.Emp_Name + '</span><span class="resultfields" style="padding-left:100px;">Category&nbsp;:</span>&nbsp;<span class="resultfieldvalues">' + employee.Desig_Name + '</span><br /><br /><span id="SalaryBasis" class="resultfields">Salary Basis&nbsp;:</span>&nbsp;<span class="resultfieldvalues">' + employee.SalaryBasis + '</span><span class="resultfields" style="padding-left:25px;">Salary&nbsp;:</span>&nbsp;<span class="resultfieldvalues">' + employee.FixedSalary + '</span><span style="font-size:110%;font-weight:bolder;padding-left:25px;">Address&nbsp;:</span>&nbsp;<span class="resultfieldvalues">' + employee.Address + '</span></div>').insertAfter('#ResultsDiv'); } There are 25 divs in my page as a result i want to show first five divs in page 1 and so on... Any suggestion... My HfJsonValue contains the following json data {"Table" : [{"Emp_Id" : "3","Identity_No" : "","Emp_Name" : "Jerome","Address" : "Madurai","Date_Of_Birth" : "","Desig_Name" : "Supervisior","Desig_Description" : "Supervisior of the Construction","SalaryBasis" : "Monthly","FixedSalary" : "25000.00"},{"Emp_Id" : "4","Identity_No" : "","Emp_Name" : "Mohan","Address" : "Madurai","Date_Of_Birth" : "","Desig_Name" : "Acc ","Desig_Description" : "Accountant","SalaryBasis" : "Monthly","FixedSalary" : "200.00"},{"Emp_Id" : "5","Identity_No" : "","Emp_Name" : "Murugan","Address" : "Madurai","Date_Of_Birth" : "","Desig_Name" : "Mason","Desig_Description" : "Mason","SalaryBasis" : "Weekly","FixedSalary" : "150.00"},{"Emp_Id" : "6","Identity_No" : "","Emp_Name" : "Ram","Address" : "Madurai","Date_Of_Birth" : "","Desig_Name" : "Mason","Desig_Description" : "Mason","SalaryBasis" : "Weekly","FixedSalary" : "120.00"},{"Emp_Id" : "7","Identity_No" : "","Emp_Name" : "Raja","Address" : "Madurai","Date_Of_Birth" : "","Desig_Name" : "Mason","Desig_Description" : "Mason","SalaryBasis" : "Weekly","FixedSalary" : "135.00"},{"Emp_Id" : "8","Identity_No" : "","Emp_Name" : "Raja kumar","Address" : "Madurai","Date_Of_Birth" : "","Desig_Name" : "Mason Helper","Desig_Description" : "Mason Helper","SalaryBasis" : "Weekly","FixedSalary" : "105.00"},{"Emp_Id" : "9","Identity_No" : "","Emp_Name" : "Lakshmi","Address" : "Madurai","Date_Of_Birth" : "","Desig_Name" : "Mason Helper","Desig_Description" : "Mason Helper","SalaryBasis" : "Weekly","FixedSalary" : "100.00"},{"Emp_Id" : "10","Identity_No" : "","Emp_Name" : "Palani","Address" : "Madurai","Date_Of_Birth" : "","Desig_Name" : "Carpenter","Desig_Description" : "Carpenter","SalaryBasis" : "Weekly","FixedSalary" : "200.00"},{"Emp_Id" : "11","Identity_No" : "","Emp_Name" : "Annamalai","Address" : "Madurai","Date_Of_Birth" : "","Desig_Name" : "Carpenter","Desig_Description" : "Carpenter","SalaryBasis" : "Weekly","FixedSalary" : "220.00"},{"Emp_Id" : "12","Identity_No" : "","Emp_Name" : "David","Address" : "Madurai","Date_Of_Birth" : "","Desig_Name" : "Steel Fixer","Desig_Description" : "Steel Fixer","SalaryBasis" : "Weekly","FixedSalary" : "220.00"},{"Emp_Id" : "13","Identity_No" : 
"","Emp_Name" : "Chandru","Address" : "Madurai","Date_Of_Birth" : "","Desig_Name" : "Steel Fixer","Desig_Description" : "Steel Fixer","SalaryBasis" : "Weekly","FixedSalary" : "220.00"},{"Emp_Id" : "14","Identity_No" : "","Emp_Name" : "Mani","Address" : "Madurai","Date_Of_Birth" : "","Desig_Name" : "Steel Helper","Desig_Description" : "Steel Helper","SalaryBasis" : "Weekly","FixedSalary" : "175.00"},{"Emp_Id" : "15","Identity_No" : "","Emp_Name" : "Karthik","Address" : "Madurai","Date_Of_Birth" : "","Desig_Name" : "Wood Fixer","Desig_Description" : "Wood Fixer","SalaryBasis" : "Weekly","FixedSalary" : "195.00"},{"Emp_Id" : "16","Identity_No" : "","Emp_Name" : "Bala","Address" : "Madurai","Date_Of_Birth" : "","Desig_Name" : "Wood Fixer","Desig_Description" : "Wood Fixer","SalaryBasis" : "Weekly","FixedSalary" : "185.00"},{"Emp_Id" : "17","Identity_No" : "","Emp_Name" : "Tamil arasi","Address" : "Madurai","Date_Of_Birth" : "","Desig_Name" : "Wood Helper","Desig_Description" : "Wood Helper","SalaryBasis" : "Weekly","FixedSalary" : "185.00"},{"Emp_Id" : "18","Identity_No" : "","Emp_Name" : "Perumal","Address" : "Madurai","Date_Of_Birth" : "","Desig_Name" : "Cook","Desig_Description" : "Cook","SalaryBasis" : "Weekly","FixedSalary" : "105.00"},{"Emp_Id" : "19","Identity_No" : "","Emp_Name" : "Andiappan","Address" : "Madurai","Date_Of_Birth" : "","Desig_Name" : "Watchman","Desig_Description" : "Watchman","SalaryBasis" : "Weekly","FixedSalary" : "150.00"}]}

    Read the article

  • Simplify Select Case in VB.NET code

    - by StealthRT
    Hey all, i am looking here to see if anyone would have a better way to acomplish this task below in less code. Select Case mainMenu.theNumOpened Case 1 Me.Text = "NBMsg1" Me.SetDesktopLocation(My.Computer.Screen.WorkingArea.Width - 302, 5) Case 2 Me.Text = "NBMsg2" Dim hwnd As IntPtr = FindWindow(vbNullString, "NBMsg1") SetWindowPos(hwnd, 0, My.Computer.Screen.WorkingArea.Width - 302, Me.Height + 10, 0, 0, 1) Me.SetDesktopLocation(My.Computer.Screen.WorkingArea.Width - 302, 5) Case 3 Me.Text = "NBMsg3" Dim hwnd As IntPtr = FindWindow(vbNullString, "NBMsg2") SetWindowPos(hwnd, 0, My.Computer.Screen.WorkingArea.Width - 302, Me.Height + 10, 0, 0, 1) hwnd = FindWindow(vbNullString, "NBMsg1") SetWindowPos(hwnd, 0, My.Computer.Screen.WorkingArea.Width - 302, (Me.Height * 2) + 15, 0, 0, 1) Me.SetDesktopLocation(My.Computer.Screen.WorkingArea.Width - 302, 5) Case 4 Me.Text = "NBMsg4" Dim hwnd As IntPtr = FindWindow(vbNullString, "NBMsg3") SetWindowPos(hwnd, 0, My.Computer.Screen.WorkingArea.Width - 302, Me.Height + 10, 0, 0, 1) hwnd = FindWindow(vbNullString, "NBMsg2") SetWindowPos(hwnd, 0, My.Computer.Screen.WorkingArea.Width - 302, (Me.Height * 2) + 15, 0, 0, 1) hwnd = FindWindow(vbNullString, "NBMsg1") SetWindowPos(hwnd, 0, My.Computer.Screen.WorkingArea.Width - 302, (Me.Height * 3) + 20, 0, 0, 1) Me.SetDesktopLocation(My.Computer.Screen.WorkingArea.Width - 302, 5) Case 5 Me.Text = "NBMsg5" Dim hwnd As IntPtr = FindWindow(vbNullString, "NBMsg4") SetWindowPos(hwnd, 0, My.Computer.Screen.WorkingArea.Width - 302, Me.Height + 10, 0, 0, 1) hwnd = FindWindow(vbNullString, "NBMsg3") SetWindowPos(hwnd, 0, My.Computer.Screen.WorkingArea.Width - 302, (Me.Height * 2) + 15, 0, 0, 1) hwnd = FindWindow(vbNullString, "NBMsg2") SetWindowPos(hwnd, 0, My.Computer.Screen.WorkingArea.Width - 302, (Me.Height * 3) + 20, 0, 0, 1) hwnd = FindWindow(vbNullString, "NBMsg1") SetWindowPos(hwnd, 0, My.Computer.Screen.WorkingArea.Width - 302, (Me.Height * 4) + 25, 0, 0, 1) Me.SetDesktopLocation(My.Computer.Screen.WorkingArea.Width - 302, 5) Case 6 Me.Text = "NBMsg6" Dim hwnd As IntPtr = FindWindow(vbNullString, "NBMsg5") SetWindowPos(hwnd, 0, My.Computer.Screen.WorkingArea.Width - 302, Me.Height + 10, 0, 0, 1) hwnd = FindWindow(vbNullString, "NBMsg4") SetWindowPos(hwnd, 0, My.Computer.Screen.WorkingArea.Width - 302, (Me.Height * 2) + 15, 0, 0, 1) hwnd = FindWindow(vbNullString, "NBMsg3") SetWindowPos(hwnd, 0, My.Computer.Screen.WorkingArea.Width - 302, (Me.Height * 3) + 20, 0, 0, 1) hwnd = FindWindow(vbNullString, "NBMsg2") SetWindowPos(hwnd, 0, My.Computer.Screen.WorkingArea.Width - 302, (Me.Height * 4) + 25, 0, 0, 1) hwnd = FindWindow(vbNullString, "NBMsg1") SetWindowPos(hwnd, 0, My.Computer.Screen.WorkingArea.Width - 302, (Me.Height * 5) + 30, 0, 0, 1) Me.SetDesktopLocation(My.Computer.Screen.WorkingArea.Width - 302, 5) Case Else Me.Close() Me.Dispose() End Select What it does is pass to it now many windows are currently already opened. So if one then of course it goes to case 1. If there are 2 opened then it moves the oldest down and puts the newest on top. etc etc. I have set it so that a max of 6 boxes can only be opened at one time. If anyone knows how i could also "slide" them down (kinda like an effect of jQuery) then that would also be, well awesome to know! :o) Any help/suggestions would be great! :o) David

    Read the article

  • How to use jQuery to paginate JSON data?

    - by Pandiya Chendur
    My json Data looks like this {"Table" : [{"Emp_Id" : "3","Identity_No" : "","Emp_Name" : "Jerome","Address" : "Madurai","Date_Of_Birth" : "","Desig_Name" : "Supervisior","Desig_Description" : "Supervisior of the Construction","SalaryBasis" : "Monthly","FixedSalary" : "25000.00"},{"Emp_Id" : "4","Identity_No" : "","Emp_Name" : "Mohan","Address" : "Madurai","Date_Of_Birth" : "","Desig_Name" : "Acc ","Desig_Description" : "Accountant","SalaryBasis" : "Monthly","FixedSalary" : "200.00"},{"Emp_Id" : "5","Identity_No" : "","Emp_Name" : "Murugan","Address" : "Madurai","Date_Of_Birth" : "","Desig_Name" : "Mason","Desig_Description" : "Mason","SalaryBasis" : "Weekly","FixedSalary" : "150.00"},{"Emp_Id" : "6","Identity_No" : "","Emp_Name" : "Ram","Address" : "Madurai","Date_Of_Birth" : "","Desig_Name" : "Mason","Desig_Description" : "Mason","SalaryBasis" : "Weekly","FixedSalary" : "120.00"},{"Emp_Id" : "7","Identity_No" : "","Emp_Name" : "Raja","Address" : "Madurai","Date_Of_Birth" : "","Desig_Name" : "Mason","Desig_Description" : "Mason","SalaryBasis" : "Weekly","FixedSalary" : "135.00"},{"Emp_Id" : "8","Identity_No" : "","Emp_Name" : "Raja kumar","Address" : "Madurai","Date_Of_Birth" : "","Desig_Name" : "Mason Helper","Desig_Description" : "Mason Helper","SalaryBasis" : "Weekly","FixedSalary" : "105.00"},{"Emp_Id" : "9","Identity_No" : "","Emp_Name" : "Lakshmi","Address" : "Madurai","Date_Of_Birth" : "","Desig_Name" : "Mason Helper","Desig_Description" : "Mason Helper","SalaryBasis" : "Weekly","FixedSalary" : "100.00"},{"Emp_Id" : "10","Identity_No" : "","Emp_Name" : "Palani","Address" : "Madurai","Date_Of_Birth" : "","Desig_Name" : "Carpenter","Desig_Description" : "Carpenter","SalaryBasis" : "Weekly","FixedSalary" : "200.00"},{"Emp_Id" : "11","Identity_No" : "","Emp_Name" : "Annamalai","Address" : "Madurai","Date_Of_Birth" : "","Desig_Name" : "Carpenter","Desig_Description" : "Carpenter","SalaryBasis" : "Weekly","FixedSalary" : "220.00"},{"Emp_Id" : "12","Identity_No" : "","Emp_Name" : "David","Address" : "Madurai","Date_Of_Birth" : "","Desig_Name" : "Steel Fixer","Desig_Description" : "Steel Fixer","SalaryBasis" : "Weekly","FixedSalary" : "220.00"},{"Emp_Id" : "13","Identity_No" : "","Emp_Name" : "Chandru","Address" : "Madurai","Date_Of_Birth" : "","Desig_Name" : "Steel Fixer","Desig_Description" : "Steel Fixer","SalaryBasis" : "Weekly","FixedSalary" : "220.00"},{"Emp_Id" : "14","Identity_No" : "","Emp_Name" : "Mani","Address" : "Madurai","Date_Of_Birth" : "","Desig_Name" : "Steel Helper","Desig_Description" : "Steel Helper","SalaryBasis" : "Weekly","FixedSalary" : "175.00"},{"Emp_Id" : "15","Identity_No" : "","Emp_Name" : "Karthik","Address" : "Madurai","Date_Of_Birth" : "","Desig_Name" : "Wood Fixer","Desig_Description" : "Wood Fixer","SalaryBasis" : "Weekly","FixedSalary" : "195.00"},{"Emp_Id" : "16","Identity_No" : "","Emp_Name" : "Bala","Address" : "Madurai","Date_Of_Birth" : "","Desig_Name" : "Wood Fixer","Desig_Description" : "Wood Fixer","SalaryBasis" : "Weekly","FixedSalary" : "185.00"},{"Emp_Id" : "17","Identity_No" : "","Emp_Name" : "Tamil arasi","Address" : "Madurai","Date_Of_Birth" : "","Desig_Name" : "Wood Helper","Desig_Description" : "Wood Helper","SalaryBasis" : "Weekly","FixedSalary" : "185.00"},{"Emp_Id" : "18","Identity_No" : "","Emp_Name" : "Perumal","Address" : "Madurai","Date_Of_Birth" : "","Desig_Name" : "Cook","Desig_Description" : "Cook","SalaryBasis" : "Weekly","FixedSalary" : "105.00"},{"Emp_Id" : "19","Identity_No" : "","Emp_Name" : "Andiappan","Address" : 
"Madurai","Date_Of_Birth" : "","Desig_Name" : "Watchman","Desig_Description" : "Watchman","SalaryBasis" : "Weekly","FixedSalary" : "150.00"}]} There are 22 records in this json... How to paginate this json data 5 per page using jquery? EDIT: The above image is my summary view of employee list iterated using jquery var jsonObj = JSON.parse(HfJsonValue); for (var i = jsonObj.Table.length - 1; i >= 0; i--) { var employee = jsonObj.Table[i]; $('<div class="resultsdiv"><br /><span class="resultName">' + employee.Emp_Name + '</span><span class="resultfields" style="padding-left:100px;">Category&nbsp;:</span>&nbsp;<span class="resultfieldvalues">' + employee.Desig_Name + '</span><br /><br /><span id="SalaryBasis" class="resultfields">Salary Basis&nbsp;:</span>&nbsp;<span class="resultfieldvalues">' + employee.SalaryBasis + '</span><span class="resultfields" style="padding-left:25px;">Salary&nbsp;:</span>&nbsp;<span class="resultfieldvalues">' + employee.FixedSalary + '</span><span style="font-size:110%;font-weight:bolder;padding-left:25px;">Address&nbsp;:</span>&nbsp;<span class="resultfieldvalues">' + employee.Address + '</span></div>').insertAfter('#ResultsDiv'); } I get 22 records now it may grow how to paginate json date by using jquery pagination.. Any suggestion...

    Read the article

  • How to organise my abstraction?

    - by DaveM
    I have a problem that I can't decide how to best handle, so I'm asking for advice. Please bear with me if my question isn't clear, it is possiblybecause I'm not sure exacly how to solve! I have a set of function that I have in a class. These function are a set of lowest commonality. To be able to run this I need to generate certain info. But this info can arrive with my class from one of 2 routes. I'll try to summarise my situation.... Lets say that I have a class as follows. public class onHoliday(){ private Object modeOfTravel; private Object location; public onHoliday(Object vehicle, Location GPScoords) { private boolean haveFun(){//function to have fun, needs 4 people }//end haveFun() } } Lets imagine I can get to my holiday either by car or by bike. my haveFun() function is dependant my type of vehicle. But only loosely. I have another function that determines my vehicle type, and extracts the required values. for example if I send a car I may get 4 people in one go, but if I send I bike I need at least 2 to get the required 4 I have currently 2 options: Overload my constructor, so as I can send either 2 bikes or a single car into it, then I can call one of 2 intermediate functions (to get the names of my 4 people for instance) before I haveFun() - this is what I am currently doing. split the 2 constructors into 2 separate classes, and repeat my haveFun() in a third class, that becomes an object of my 2 other classes. my problem with this is that my intermediate functions are all but a few lines of code, and I don't want to have them in a separate file! (I allways put classes in separate files!) Please note, my haveFun() isn't something that I'm going to need outside of these 2 classes, or even being onHoliday (ie. there is no chance of me doing some haveFun() over a weekend or of an evening!). I have though about putting haveFun() into an interface, but it seems a bit worthless having an interface with only a single method! Even then I would have to have the method in both of the classes -one for bike and another for car! I have thought about having my onHoliday class accepting any object type, but then I don't want someone accidentally sending in a boat to my onHoliday class (imagine I can't swim, so its not about to happen). It may be important to note that my onHoliday class is package private, and final. It in fact is only accessed via other 'private methods' in other classes, and has only private methods itself. Thanks in advance. David

    Read the article

  • Print a linear linked list into a table

    - by user1796970
    I am attempting to print some values i have stored into a LLL into a readable table. The data i have stored is the following : DEBBIE STARR F 3 W 1000.00 JOAN JACOBUS F 9 W 925.00 DAVID RENN M 3 H 4.75 ALBERT CAHANA M 3 H 18.75 DOUGLAS SHEER M 5 W 250.00 SHARI BUCHMAN F 9 W 325.00 SARA JONES F 1 H 7.50 RICKY MOFSEN M 6 H 12.50 JEAN BRENNAN F 6 H 5.40 JAMIE MICHAELS F 8 W 150.00 i have stored each firstname, lastname, gender, tenure, payrate, and salary into their own List. And would like to be able to print them out in the same format that they are viewed on the text file i read them in from. i have messed around with a few methods that allow me to traverse and print the Lists, but i end up with ugly output. . . here is my code for the storage of the text file and the format i would like to print out: public class Payroll { private LineWriter lw; private ObjectList output; ListNode input; private ObjectList firstname, lastname, gender, tenure, rate, salary; public Payroll(LineWriter lw) { this.lw = lw; this.firstname = new ObjectList(); this.lastname = new ObjectList(); this.gender = new ObjectList(); this.tenure = new ObjectList(); this.rate = new ObjectList(); this.salary = new ObjectList(); this.output = new ObjectList(); this.input = new ListNode(); } public void readfile() { File file = new File("payfile.txt"); try{ Scanner scanner = new Scanner(file); while(scanner.hasNextLine()) { String line = scanner.nextLine(); Scanner lineScanner = new Scanner(line); lineScanner.useDelimiter("\\s+"); while(lineScanner.hasNext()) { firstname.insert1(lineScanner.next()); lastname.insert1(lineScanner.next()); gender.insert1(lineScanner.next()); tenure.insert1(lineScanner.next()); rate.insert1(lineScanner.next()); salary.insert1(lineScanner.next()); } } }catch(FileNotFoundException e) {e.printStackTrace();} } public void printer(LineWriter lw) { String msg = " FirstName " + " LastName " + " Gender " + " Tenure " + " Pay Rate " + " Salary "; output.insert1(msg); System.out.println(output.getFirst()); System.out.println(" " + firstname.getFirst() + " " + lastname.getFirst() + "\t" + gender.getFirst() + "\t" + tenure.getFirst() + "\t" + rate.getFirst() + "\t" + salary.getFirst()); } }

    Read the article

  • Populate Multiple PDFs

    - by gmcalab
    I am using itextsharp to populate my PDFs. I have no issues with this. Basically what I am doing is getting the PDF and populating the fields in memory then passing back the MemoryStream to be displayed on a webpage. All this is working with a single document PDF. What I am trying to figure out now, is merging multiple PDFs into one MemoryStream. The part I cant figure out is, the documents I am populating are identical. So for example, I have a List<Person> that contains 5 persons. I want to fill out a PDF for each person and merge them all into one, in memory. Bare in mind I am going to fill out the same type of document for each person. The problem I am getting is that when I try to add a second copy of the same PDF to be filled out for the second iteration, it just overwrites the first populated PDF, since it's the same document, therefore not adding a second copy for the second Person at all. So basically if I had the 5 people, I would end up with a single page with the data of the 5th person, instead of a PDF with 5 like pages that contain the data of each person respectively. Here's some code... MemoryStream ms = ms = new MemoryStream(); PdfReader docReader = null; PdfStamper Stamper = null; List<Person> persons = new List<Person>() { new Person("Larry", "David"), new Person("Dustin", "Byfuglien"), new Person("Patrick", "Kane"), new Person("Johnathan", "Toews"), new Person("Marian", "Hossa") }; try { // Iterate thru all persons and populate a PDF for each foreach(var person in persons){ PdfCopyFields Copier = new PdfCopyFields(ms); Copier.AddDocument(GetReader("Person.pdf")); Copier.Close(); docReader = new PdfReader(ms.ToArray()); Stamper = new PdfStamper(docReader, ms); AcroFields Fields = Stamper.AcroFields; Fields.SetField("FirstName", person.FirstName); } }catch(Exception e){ // handle error }finally{ if (Stamper != null) { Stamper.Close(); } if (docReader != null) { docReader.Close(); } }

    Read the article

  • CSS Footer bar bottom center issue

    - by StealthRT
    Hey all, i am trying to get my bottom bar to center on the screen but i am unable to do so. <style type="text/css"> body { background: #fffff; margin: 0; padding: 0; font: 10px normal Verdana, Arial, Helvetica, sans-serif; } * {margin: 0; padding: 0; outline: none;} #bottomBar { position: fixed; bottom: 0px; left: 0px; z-index: 9999; background: #e3e2e2; border: 1px solid #c3c3c3; border-bottom: none; width: 500px; min-width: 500px; margin: 0px auto; -moz-opacity:.90; filter:alpha(opacity=90); opacity:.90; } *html #bottomBar {margin-top: -1px; position: absolute; top:expression(eval(document.compatMode &&document.compatMode=='CSS1Compat') ?documentElement.scrollTop+(documentElement.clientHeight-this.clientHeight) : document.body.scrollTop +(document.body.clientHeight-this.clientHeight));} #bottomBar ul {padding: 0; margin: 0;float: left;width: 100%;list-style: none;border-top: 1px solid #fff;} #bottomBar ul li{padding: 0; margin: 0;float: left;position: relative;} #bottomBar ul li a{padding: 5px;float: left;text-indent: -9999px;height: 16px; width: 16px;text-decoration: none;color: #333;position: relative;} html #bottomBar ul li a:hover{ background-color: #fff; } a.PDF{background: url(http://www.xxx.com/img/pdficon.png) no-repeat center center; } </style> <div id="bottomBar"> <ul id="mainpanel"> <li style="padding-top:5px; font-family:Arial, Helvetica, sans-serif; padding-left: 5px;">First time here? Be sure to check out the "this" button above or download the PDF here -></li> <li><a href="http://www.xxx.com" class="PDF">Download PDF <small>Download PDF</small></a></li> </ul> </div> Thanks! David
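    The bar stays on the left because a fixed-position element ignores margin: 0 auto while left: 0 pins it to the edge; the usual CSS-only fix is left: 50% plus a negative margin-left of roughly half the bar's width (about -251px here, counting the border). If a script-based fallback is also wanted (for example alongside the IE6 expression branch), a hedged jQuery sketch that recentres the bar on load and resize could look like this; it assumes jQuery is available on the page, which the question does not show:

    ```javascript
    // Sketch: recentre the fixed #bottomBar on load and whenever the window is resized.
    $(function () {
        function centreBottomBar() {
            var $bar = $('#bottomBar');
            var left = Math.max(0, ($(window).width() - $bar.outerWidth()) / 2);
            $bar.css('left', left + 'px');
        }

        centreBottomBar();                  // centre once the DOM is ready
        $(window).resize(centreBottomBar);  // keep it centred as the viewport changes
    });
    ```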

    Read the article

  • Why is my JavaScript function sometimes "not defined"?

    - by harpo
    Problem: I call my javascript function, and sometimes I get the error 'myFunction is not defined'. But it is defined. For example. I'll occasionally get 'copyArray is not defined' even in this example: function copyArray( pa ) { var la = []; for (var i=0; i < pa.length; i++) la.push( pa[i] ); return la; } Function.prototype.bind = function( po ) { var __method = this; var __args = []; // sometimes errors -- in practice I inline the function as a workaround __args = copyArray( arguments ); return function() { /* bind logic omitted for brevity */ } } As you can see, copyArray is defined right there, so this can't be about the order in which script files load. I've been getting this in situations that are harder to work around, where the calling function is located in another file that should be loaded after the called function. But this was the simplest case I could present, and appears to be the same problem. It doesn't happen 100% of the time, so I do suspect some kind of load-timing-related problem. But I have no idea what. @Hojou: That's part of the problem. The function in which I'm now getting this error is itself my addLoadEvent, which is basically a standard version of the common library function. @James: I understand that, and there is no syntax error in the function. When that is the case, the syntax error is reported as well. In this case, I am getting only the 'not defined' error. @David: The script in this case resides in an external file that is referenced using the normal <script src="file.js"></script> method in the page's head section. @Douglas: Interesting idea, but if this were the case, how could we ever call a user-defined function with confidence? In any event, I tried this and it didn't work. @sk: This technique has been tested across browsers and is basically copied from the prototype library.
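    No single snippet will pin down an intermittent "not defined" error, but one way to gather evidence is to check, at the moment of failure, whether the whole script file executed or only that one symbol is missing (a parse error anywhere in a file stops every definition in that file). A small diagnostic sketch; the marker flag and the safeCopyArray wrapper are illustrative names, not existing code:

    ```javascript
    // Diagnostic sketch: distinguish "the whole file never ran" from "only copyArray is missing".
    // 1) At the very end of the file that defines copyArray, set a marker:
    window.copyArrayFileLoaded = true;

    // 2) Wherever the intermittent error appears, guard the call and report what is known:
    function safeCopyArray(pa) {
        if (typeof copyArray === 'function') {
            return copyArray(pa);
        }
        alert('copyArray missing; rest of its file executed: ' +
              (window.copyArrayFileLoaded === true) +
              '; document readyState: ' + document.readyState);
        return [];   // degrade gracefully so the caller can continue
    }
    ```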

    Read the article

  • How can the DataView object reference not be set?

    - by dboarman-FissureStudios
    I have the following sample where the SourceData class would represent a DataView resulting from an Sql query: class MainClass { private static SourceData Source; private static DataView View; private static DataView Destination; public static void Main (string[] args) { Source = new SourceData(); View = new DataView(Source.Table); Destination = new DataView(); Source.AddRowData("Table1", 100); Source.AddRowData("Table2", 1500); Source.AddRowData("Table3", 1300324); Source.AddRowData("Table4", 1122494); Source.AddRowData("Table5", 132545); Console.WriteLine(String.Format("Data View Records: {0}", View.Count)); foreach(DataRowView drvRow in View) { Console.WriteLine(String.Format("Source {0} has {1} records.", drvRow["table"], drvRow["records"])); DataRowView newRow = Destination.AddNew(); newRow["table"] = drvRow["table"]; newRow["records"] = drvRow["records"]; } Console.WriteLine(); Console.WriteLine(String.Format("Destination View Records: {0}", Destination.Count)); foreach(DataRowView drvRow in Destination) { Console.WriteLine(String.Format("Destination {0} has {1} records.", drvRow["table"], drvRow["records"])); } } } class SourceData { public DataTable Table { get{return dataTable;} } private DataTable dataTable; public SourceData() { dataTable = new DataTable("TestTable"); dataTable.Columns.Add("table", typeof(string)); dataTable.Columns.Add("records", typeof(int)); } public void AddRowData(string tableName, int tableRows) { dataTable.Rows.Add(tableName, tableRows); } } My output is: Data View Records: 5 Source Table1 has 100 records. Unhandled Exception: System.NullReferenceException: Object reference not set to an instance of an object at System.Data.DataView.AddNew () [0x0003e] in /usr/src/packages/BUILD/mono-2.4.2.3 /mcs/class/System.Data/System.Data/DataView.cs:344 at DataViewTest.MainClass.Main (System.String[] args) [0x000e8] in /home/david/Projects/DataViewTest/SourceData.cs:29 I did some reading here: DataView:AddNew Method... ...and it would appear that I am doing this the right way. How come I am getting the Object reference not set?

    Read the article

  • MySQL problem when baking with CakePHP

    - by timkl
    I am currently reading "Beginning CakePHP:From Novice to Professional" by David Golding. At one point I have to use the CLI-command "cake bake", I get the welcome-screen but when I try to bake e.g. a Controller I get the following error messages: Warning: mysql_connect(): Can't connect to local MySQL server through socket '/var/mysql/mysql.sock' (2) in /Applications/MAMP/htdocs/blog/cake/libs/model/datasources/dbo/dbo_mysql.php on line 117 Warning: mysql_select_db(): supplied argument is not a valid MySQL-Link resource in /Applications/MAMP/htdocs/blog/cake/libs/model/datasources/dbo/dbo_mysql.php on line 122 Warning: mysql_get_server_info(): supplied argument is not a valid MySQL-Link resource in /Applications/MAMP/htdocs/blog/cake/libs/model/datasources/dbo/dbo_mysql.php on line 130 Warning: mysql_query(): supplied argument is not a valid MySQL-Link resource in /Applications/MAMP/htdocs/blog/cake/libs/model/datasources/dbo/dbo_mysql.php on line 154 Error: Your database does not have any tables. I suspect that the error-messages has to do with php trying to access the wrong mysql-socket, namely the default osx mysql-socket - instead of the one that MAMP uses. Hence I change my database configurations to connect to the UNIX mysql-socket (:/Applications/MAMP/tmp/mysql/mysql.sock): class DATABASE_CONFIG { var $default = array( 'driver' => 'mysql', 'connect' => 'mysql_connect', 'persistent' => false, 'host' =>':/Applications/MAMP/tmp/mysql/mysql.sock', // UNIX MySQL-socket 'login' => 'my_user', 'password' => 'my_pass', 'database' => 'blog', 'prefix' => '', ); } But I get the same error-messages with the new socket: Warning: mysql_connect(): Can't connect to local MySQL server through socket '/Applications/MAMP/tmp/mysql/mysql.sock:3306' (2) in /Applications/MAMP/htdocs/blog/cake/libs/model/datasources/dbo/dbo_mysql.php on line 117 Warning: mysql_select_db(): supplied argument is not a valid MySQL-Link resource in /Applications/MAMP/htdocs/blog/cake/libs/model/datasources/dbo/dbo_mysql.php on line 122 Warning: mysql_get_server_info(): supplied argument is not a valid MySQL-Link resource in /Applications/MAMP/htdocs/blog/cake/libs/model/datasources/dbo/dbo_mysql.php on line 130 Warning: mysql_query(): supplied argument is not a valid MySQL-Link resource in /Applications/MAMP/htdocs/blog/cake/libs/model/datasources/dbo/dbo_mysql.php on line 154 Error: Your database does not have any tables. Also, even though I use the UNIX-socket that MAMP show on it's welcome-screen, CakePHP loses the database-connection, when using this socket instead of localhost. Any ideas on how I can get bake to work?

    Read the article

  • Grouping geographical shapes

    - by grenade
    I am using Dundas Maps and attempting to draw a map of the world where countries are grouped into regions that are specific to a business implementation. I have shape data (points and segments) for each country in the world. I can combine countries into regions by adding all points and segments for countries within a region to a new region shape. foreach(var region in GetAllRegions()){ var regionShape = new Shape { Name = region.Name }; foreach(var country in GetCountriesInRegion(region.Id)){ var countryShape = GetCountryShape(country.Id); regionShape.AddSegments(countryShape.ShapeData.Points, countryShape.ShapeData.Segments); } map.Shapes.Add(regionShape); } The problem is that the country border lines still show up within a region and I want to remove them so that only regional borders show up. Dundas polygons must start and end at the same point. This is the case for all the country shapes. Now I need an algorithm that can: Determine where country borders intersect at a regional border, so that I can join the regional border segments. Determine which country borders are not regional borders so that I can discard them. Sort the resulting regional points so that they sequentialy describe the shape boundaries. Below is where I have gotten to so far with the map. You can see that the country borders still need to be removed. For example, the border between Mongolia and China should be discarded whereas the border between Mongolia and Russia should be retained. The reason I need to retain a regional border is that the region colors will be significant in conveying information but adjacent regions may be the same color. The regions can change to include or exclude countries and this is why the regional shaping must be dynamic. EDIT: I now know that I what I am looking for is a UNION of polygons. David Lean explains how to do it using the spatial functions in SQL Server 2008 which might be an option but my efforts have come to a halt because the resulting polygon union is so complex that SQL truncates it at 43,680 characters. I'm now trying to either find a workaround for that or find a way of doing the union in code.

    Read the article

  • How to use Crtl in a Delphi unit in a C++Builder project? (or link to C++Builder C runtime library)

    - by Craig Peterson
    I have a Delphi unit that is statically linking a C .obj file using the {$L xxx} directive. The C file is compiled with C++Builder's command line compiler. To satisfy the C file's runtime library dependencies (_assert, memmove, etc), I'm including the crtl unit Allen Bauer mentioned here. unit FooWrapper; interface implementation uses Crtl; // Part of the Delphi RTL {$L FooLib.obj} // Compiled with "bcc32 -q -c foolib.c" procedure Foo; cdecl; external; end. If I compile that unit in a Delphi project (.dproj) everthing works correctly. If I compile that unit in a C++Builder project (.cbproj) it fails with the error: [ILINK32 Error] Fatal: Unable to open file 'CRTL.OBJ' And indeed, there isn't a crtl.obj file in the RAD Studio install folder. There is a .dcu, but no .pas. Trying to add crtdbg to the uses clause (the C header where _assert is defined) gives an error that it can't find crtdbg.dcu. If I remove the uses clause, it instead fails with errors that __assert and _memmove aren't found. So, in a Delphi unit in a C++Builder project, how can I export functions from the C runtime library so they're available for linking? I'm already aware of Rudy Velthuis's article. I'd like to avoid manually writing Delphi wrappers if possible, since I don't need them in Delphi, and C++Builder must already include the necessary functions. Edit For anyone who wants to play along at home, the code is available in Abbrevia's Subversion repository at https://tpabbrevia.svn.sourceforge.net/svnroot/tpabbrevia/trunk. I've taken David Heffernan's advice and added a "AbCrtl.pas" unit that mimics crtl.dcu when compiled in C++Builder. That got the PPMd support working, but the Lzma and WavPack libraries both fail with link errors: [ILINK32 Error] Error: Unresolved external '_beginthreadex' referenced from ABLZMA.OBJ [ILINK32 Error] Error: Unresolved external 'sprintf' referenced from ABWAVPACK.OBJ [ILINK32 Error] Error: Unresolved external 'strncmp' referenced from ABWAVPACK.OBJ [ILINK32 Error] Error: Unresolved external '_ftol' referenced from ABWAVPACK.OBJ AFAICT, all of them are declared correctly, and the _beginthreadex one is actually declared in AbLzma.pas, so it's used by the pure Delphi compile as well. To see it yourself, just download the trunk (or just the "source" and "packages" directories), disable the {$IFDEF BCB} block at the bottom of AbDefine.inc, and try to compile the C++Builder "Abbrevia.cbproj" project.

    Read the article

  • Facebook Like not Pulling Title

    - by matthewb
    I am having some issues with the new facebook like button. It shows up fine, however for some reason it's not pulling the title of the page. I have the and I am using og:title, all are filled in, when I view the source of the iframe created on the load of he button, the in there is blank. I am also trying to put the tweetmeme and that's seeing that title and not the meta, or the normal page title. What am I doing wrong? <html xmlns="http://www.w3.org/1999/xhtml" xmlns:og="http://opengraphprotocol.org/schema/" xmlns:fb="http://developers.facebook.com/schema/"> <fb:like href="<?=$url?>" layout="button_count" show_faces="false" width="100" font="arial"></fb:like> UPDATE: Complete Head <head> <title>Ladder 15 Gets A New Menu!</title> <meta content="Growing up can be hard to do, especially in the Mad River family. But Ladder 15 has come into its own over the winter.&nbsp; With some new cocktails, wine selection, a hefty new beer list and veteran Chef David Ansill in the kitchen, you can check your fist pump at" name="Description"> <meta content="" name="Keywords"> <meta content="cities2night inc." name="author"> <meta content="Cities2Night 2010" name="copyright"> <meta content="en-us" name="language"> <meta content="General" name="rating"> <meta content="index,follow" name="robots"> <meta content="text/html; charset=UTF-8" http-equiv="Content-Type"> <meta content="Ladder 15 gets a new menu!" name="tweetmeme-title"> <meta content="Ladder 15 gets a new menu!" property="og:title"> <meta content="article" property="og:type"> <meta content="http://philly.cities2night.com/articles/show/listing/11/ladder-15-gets-a-new-menu" property="og:url"> <meta content="http://philly.cities2night.com/public/article_images/11.jpg" property="og:image"> <meta content="c0176da0ec38aaf107c0ef6c8cdeee38" property="fb:app_id"> <meta content="Philly2night.com" property="og:site_name"> <meta content="Growing up can be hard to do, especially in the Mad River family. But Ladder" property="og:description"></head>

    Read the article

  • Estimating the boundary of arbitrarily distributed data

    - by Dave
    I have two dimensional discrete spatial data. I would like to make an approximation of the spatial boundaries of this data so that I can produce a plot with another dataset on top of it. Ideally, this would be an ordered set of (x,y) points that matplotlib can plot with the plt.Polygon() patch. My initial attempt is very inelegant: I place a fine grid over the data, and where data is found in a cell, a square matplotlib patch is created of that cell. The resolution of the boundary thus depends on the sampling frequency of the grid. Here is an example, where the grey region are the cells containing data, black where no data exists. OK, problem solved - why am I still here? Well.... I'd like a more "elegant" solution, or at least one that is faster (ie. I don't want to get on with "real" work, I'd like to have some fun with this!). The best way I can think of is a ray-tracing approach - eg: from xmin to xmax, at y=ymin, check if data boundary crossed in intervals dx y=ymin+dy, do 1 do 1-2, but now sample in y An alternative is defining a centre, and sampling in r-theta space - ie radial spokes in dtheta increments. Both would produce a set of (x,y) points, but then how do I order/link neighbouring points them to create the boundary? A nearest neighbour approach is not appropriate as, for example (to borrow from Geography), an isthmus (think of Panama connecting N&S America) could then close off and isolate regions. This also might not deal very well with the holes seen in the data, which I would like to represent as a different plt.Polygon. The solution perhaps comes from solving an area maximisation problem. For a set of points defining the data limits, what is the maximum contiguous area contained within those points To form the enclosed area, what are the neighbouring points for the nth point? How will the holes be treated in this scheme - is this erring into topology now? Apologies, much of this is me thinking out loud. I'd be grateful for some hints, suggestions or solutions. I suspect this is an oft-studied problem with many solution techniques, but I'm looking for something simple to code and quick to run... I guess everyone is, really! Cheers, David

    Read the article

  • NHibernate Many-to-Many deletes all my data in the table

    - by Daoming Yang
    I would love to thank @Stefan Steinegger and @David helped me out yesterday with many-to-many mapping. I have 3 tables which are "News", "Tags" and "News_Tags" with Many-To-Many relationship and the "News_Tags" is the link table. If I delete one of the news records, the following mappings will delete all my news records which have the same tags. One thing I need to notice, I only allowed unique tag stored in the "Tag" table. This mapping make sense for me, it will delete the tag and related News records, but how can I implement a tagging system with NHibernate? Can anyone give me some suggestion? Many thanks. Daoming. News Mapping: <class name="New" table="News" lazy="false"> <id name="NewID"> <generator class="identity" /> </id> <property name="Title" type="String"></property> <property name="Description" type="String"></property> <set name="TagsList" table="New_Tags" lazy="false" inverse="true" cascade="all"> <key column="NewID" /> <many-to-many class="Tag" column="TagID" /> </set> </class> Tag Mapping: <class name="Tag" table="Tags" lazy="false"> <id name="TagID"> <generator class="identity" /> </id> <property name="TagName" type="String"></property> <property name="DateCreated" type="DateTime"></property> <!--inverse="true" has been defined in the "News mapping"--> <set name="NewsList" table="New_Tags" lazy="false" cascade="all"> <key column="TagID" /> <many-to-many class="New" column="NewID" /> </set> </class>

    Read the article

  • JavaScript and a VERY LONG string

    - by StealthRT
    Hey all, i am having problems with the below code: function showTableData() { var tableArray; var x = 0; var theHTML; for (i = 0; i < 7032; i++) { if (x = 0) { theHTML = '<tr>' + '<th scope="row" class="spec">' + partNum[i] + '</th>' + '<td>' + Msrp[i] + '</td>' + '<td>' + blah[i] + '</td>' + '<td>' + blahs[i] + '</td>' + '</tr>' + theHTML; x++; }else{ theHTML = '<tr>' + '<th scope="row" class="specalt">' + partNum[i] + '</th>' + '<td class="alt">' + Msrp[i] + '</td>' + '<td class="alt">' + blah[i] + '</td>' + '<td class="alt">' + blahs[i] + '</td>' + '</tr>' + theHTML; x--; } } theHTML = '<table id="mytable" cellspacing="0">' + '<tr>' + '<th scope="col" abbr="Configurations" class="nobg">Part Number</th>' + '<th scope="col" abbr="Dual 1.8">Msrp Price</th>' + '<th scope="col" abbr="Dual 2">blahs Price</th>' + '<th scope="col" abbr="Dual 2.5">Low Price</th>' + '</tr>' + theHTML + '</table>'; $('#example').append(theHTML); } </script> <div id="example"> </div> The problem being that the $('#example').append(theHTML); never executes (or shows on the page). I think its because the string is soooooo long! It has over 7,000 items in the array so im not sure if thats the reason or if its something else? Any help would be great! Thanks! David
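    Two things stand out in the snippet: "if (x = 0)" is an assignment rather than a comparison (so the spec/specalt alternation never works as intended), and concatenating several thousand rows one string at a time is very slow in 2010-era browsers, which can look as if the append never ran. Neither is guaranteed to be the whole story, but here is a sketch of the same table built by pushing rows into an array and appending the finished markup once; it assumes the partNum, Msrp, blah and blahs arrays from the question are already populated:

    ```javascript
    // Sketch: build the rows in an array and append the finished table once.
    function showTableData() {
        var rows = [];
        rows.push('<table id="mytable" cellspacing="0"><tr>' +
                  '<th scope="col" class="nobg">Part Number</th>' +
                  '<th scope="col">Msrp Price</th>' +
                  '<th scope="col">blahs Price</th>' +
                  '<th scope="col">Low Price</th></tr>');

        for (var i = 0; i < partNum.length; i++) {
            var alt = (i % 2 === 1);   // alternate row styling with ===, not =
            rows.push('<tr><th scope="row" class="' + (alt ? 'specalt' : 'spec') + '">' +
                      partNum[i] + '</th>' +
                      '<td' + (alt ? ' class="alt"' : '') + '>' + Msrp[i] + '</td>' +
                      '<td' + (alt ? ' class="alt"' : '') + '>' + blah[i] + '</td>' +
                      '<td' + (alt ? ' class="alt"' : '') + '>' + blahs[i] + '</td></tr>');
        }

        rows.push('</table>');
        $('#example').append(rows.join(''));   // one DOM update instead of thousands
    }
    ```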

    Read the article

  • How to iterate a list inside a list in Java?

    - by user2142786
    Hi i have two value object classes . package org.array; import java.util.List; public class Father { private String name; private int age ; private List<Children> Childrens; public String getName() { return name; } public void setName(String name) { this.name = name; } public int getAge() { return age; } public void setAge(int age) { this.age = age; } public List<Children> getChildrens() { return Childrens; } public void setChildrens(List<Children> childrens) { Childrens = childrens; } } second is for children package org.array; public class Children { private String name; private int age ; public String getName() { return name; } public void setName(String name) { this.name = name; } public int getAge() { return age; } public void setAge(int age) { this.age = age; } } and i want to print there value i nested a list inside a list here i am putting only a single value inside the objects while in real i have many values . so i am nesting list of children inside father list. how can i print or get the value of child and father both. here is my logic. package org.array; import java.util.ArrayList; import java.util.Iterator; import java.util.List; public class ArrayDemo { public static void main(String[] args) { List <Father> fatherList = new ArrayList<Father>(); Father father = new Father(); father.setName("john"); father.setAge(25); fatherList.add(father); List <Children> childrens = new ArrayList<Children>(); Children children = new Children(); children.setName("david"); children.setAge(2); childrens.add(children); father.setChildrens(childrens); fatherList.add(father); Iterator<Father> iterator = fatherList.iterator(); while (iterator.hasNext()) { System.out.println(iterator.toString()); } } }

    Read the article

  • Conversation as User Assistance

    - by ultan o'broin
    Applications User Experience members (Erika Web, Laurie Pattison, and I) attended the User Assistance Europe Conference in Stockholm, Sweden. We were impressed with the thought leadership and practical application of ideas in Anne Gentle's keynote address "Social Web Strategies for Documentation". After the conference, we spoke with Anne to explore the ideas further. Anne Gentle (left) with Applications User Experience Senior Director Laurie Pattison In Anne's book called Conversation and Community: The Social Web for Documentation, she explains how user assistance is undergoing a seismic shift. The direction is away from the old print manuals and online help concept towards a web-based, user community-driven solution using social media tools. User experience professionals now have a vast range of such tools to start and nurture this "conversation": blogs, wikis, forums, social networking sites, microblogging systems, image and video sharing sites, virtual worlds, podcasts, instant messaging, mashups, and so on. That user communities are a rich source of user assistance is not a surprise, but the extent of available assistance is. For example, we know from the Consortium for Service Innovation that there has been an 'explosion' of user-generated content on the web. User-initiated community conversations provide as much as 30 times the number of official help desk solutions for consortium members! The growing reliance on user community solutions is clearly a user experience issue. Anne says that user assistance as conversation "means getting closer to users and helping them perform well. User-centered design has been touted as one of the most important ideas developed in the last 20 years of workplace writing. Now writers can take the idea of user-centered design a step further by starting conversations with users and enabling user assistance in interactions." Some of Anne's favorite examples of this paradigm shift from the world of traditional documentation to community conversation include: Writer Bob Bringhurst's blog about Adobe InDesign and InCopy products and Adobe's community help The Microsoft Development Network Community Center ·The former Sun (now Oracle) OpenDS wiki, NetBeans Ruby and other community approaches to engage diverse audiences using screencasts, wikis, and blogs. Cisco's customer support wiki, EMC's community, as well as Symantec and Intuit's approaches The efforts of Ubuntu, Mozilla, and the FLOSS community generally Adobe Writer Bob Bringhurst's Blog Oracle is not without a user community conversation too. Besides the community discussions and blogs around documentation offerings, we have the My Oracle Support Community forums, Oracle Technology Network (OTN) communities, wiki, blogs, and so on. We have the great work done by our user groups and customer councils. Employees like David Haimes reach out, and enthusiastic non-employee gurus like Chet Justice (OracleNerd), Floyd Teter and Eddie Awad provide great "how-to" information too. But what does this paradigm shift mean for existing technical writers as users turn away from the traditional printable PDF manual deliverables? We asked Anne after the conference. The writer role becomes one of conversation initiator or enabler. The role evolves, along with the process, as the users define their concept of user assistance and terms of engagement with the product instead of having it pre-determined. It is largely a case now of "inventing the job while you're doing it, instead of being hired for it" Anne said. 
There is less emphasis on formal titles. Anne mentions that her own title "Content Stacker" at OpenStack; others use titles such as "Content Curator" or "Community Lead". However, the role remains one essentially about communications, "but of a new type--interacting with users, moderating, curating content, instead of sitting down to write a manual from start to finish." Clearly then, this role is open to more than professional technical writers. Product managers who write blogs, developers who moderate forums, support professionals who update wikis, rock star programmers with a penchant for YouTube are ideal. Anyone with the product knowledge, empathy for the user, and flair for relationships on the social web can join in. Some even perform these roles already but do not realize it. Anne feels the technical communicator space will move from hiring new community conversation professionals (who are already active in the space through blogging, tweets, wikis, and so on) to retraining some existing writers over time. Our own research reveals that the established proponents of community user assistance even set employee performance objectives for internal content curators about the amount of community content delivered by people outside the organization! To take advantage of the conversations on the web as user assistance, enterprises must first establish where on the spectrum their community lies. "What is the line between community willingness to contribute and the enterprise objectives?" Anne asked. "The relationship with users must be managed and also measured." Anne believes that the process can start with a "just do it" approach. Begin by reaching out to existing user groups, individual bloggers and tweeters, forum posters, early adopter program participants, conference attendees, customer advisory board members, and so on. Use analytical tools to measure the level of conversation about your products and services to show a return on investment (ROI), winning management support. Anne emphasized that success with the community model is dependent on lowering the technical and motivational barriers so that users can readily contribute to the conversation. Simple tools must be provided, and guidelines, if any, must be straightforward but not mandatory. The conversational approach is one where traditional style and branding guides do not necessarily apply. Tools and infrastructure help users to create content easily, to search and find the information online, read it, rate it, translate it, and participate further in the content's evolution. Recognizing contributors by using ratings on forums, giving out Twitter kudos, conference invitations, visits to headquarters, free products, preview releases, and so on, also encourages the adoption of the conversation model. The move to conversation as user assistance is not free, but there is a business ROI. The conversational model means that customer service is enhanced, as user experience moves from a functional to a valued, emotional level. Studies show a positive correlation between loyalty and financial performance (Consortium for Service Innovation, 2010), and as customer experience and loyalty become key differentiators, user experience professionals cannot explore the model's possibilities. The digital universe (measured at 1.2 million petabytes in 2010) is doubling every 12 to 18 months, and 70 percent of that universe consists of user-generated content (IDC, 2010). Conversation as user assistance cannot be ignored but must be embraced. 
It is a time to manage for abundance, not scarcity. Besides, the conversation approach certainly sounds more interesting, rewarding, and fun than the traditional model! I would like to thank Anne for her time and thoughts, and recommend that all user assistance professionals read her book. You can follow Anne on Twitter at: http://www.twitter.com/annegentle. Oracle's Acrolinx IQ deployment was used to author this article.

    Read the article

  • December release of Microsoft All-In-One Code Framework is available now.

    - by Jialiang
    The code samples in Microsoft All-In-One Code Framework are updated on 2010-12-13. Download address: http://1code.codeplex.com/releases/view/57459#DownloadId=185534 Updated code sample index categorized by technologies: http://1code.codeplex.com/wikipage?title=All-In-One%20Code%20Framework%20Sample%20Catalog (it also allows you to download individual code samples instead of the entire All-In-One Code Framework sample package.) If it’s the first time that you hear about Microsoft All-In-One Code Framework, please watch the introduction video on YouTube http://www.youtube.com/watch?v=cO5Li3APU58, or read the introduction on our homepage http://1code.codeplex.com/,  and this Port25 article http://port25.technet.com/archive/2010/01/18/the-all-in-one-code-framework.aspx.  -------------- New ASP.NET Code Samples VBASPNETAJAXWebChat and CSASPNETAJAXWebChat Most of you have some experience in chatting with friends on the web. So you may want to know how to make a web chat application, it seems to be quite complicated. But ASP.NET gives you the power to buiild a chat room easily. In this code sample, we will construct our own web chat room with the amazing AJAX feature. The principle is simple relatively. As we all know, a base chat application need 4 base controls: one List control to show the chat room members, one List control to show the message list, one TextBox control to input messages and one button to send message. User inputs his message in the textbox first and then presses Send button, it will send the message to the server. The message list will update every 2 seconds to get the newest message list in the chat room from the server. We need to know, it is hard for us to make an AJAX web chat application like a windows form application because we cannot keep the connection after one web request ended. So a lot of events which communicates between client side and server side cannot be realized. The common workaround is to make web requests in every some seconds to check whether the server side has been updated. But another technique called COMET makes it possible. But it is different with AJAX and will not be talked in details in this KB. For more details about COMET, we can get some clues from the Reference.   CSASPNETCurrentOnlineUserList and VBASPNETCurrentOnlineUserList This sample demos a system that needs to display a list of current online users' information. As a matter of fact, Membership.GetNumberOfUsersOnline Method  can get the number of online users and there is a convenient approach to check whether the user is online by using Membership.GetUser(string userName).IsOnline property,however many asp.net projects are not using membership.So in this case,the sample shows how to display a list of current online users' information without using membership provider. It is not difficult to check whether the user is online by using session.Many projects tend to be used “Session_End” event to mark a user as “Offline”,however ,it may not be a good idea,because it can’t detect the user status accurately. In addition, "Session_End" event is only available in the "InProc" session mode. If you are storing session states in the State Server or SQL Server, "Session_End" event will never fire. To handle this issue, we need to save the user online status to a  global DataTable or  DataBase. 
    In the sample application, a global DataTable stores the information about the online users. The pages use XmlHttpRequest to update and check each user's last active time at intervals, and to retrieve how many users are still online; the sample project automatically deletes offline users' information from the global DataTable by checking the users' last active time. A step-by-step guide to displaying a list of current online users without using a membership provider: 1. Login page: let the user sign in, initialize the global DataTable that stores the current online users' information, and add the current user's information to it. 2. Current online user list page: use XmlHttpRequest to update and check the user's last active time at intervals, and retrieve how many users are still online. 3. If the user closes the page without clicking the sign-out link, the sample project automatically marks the user as offline and deletes the user's information from the global DataTable by checking the last active time. The current online user list then looks like this: CSASPNETIPtoLocation: this sample demonstrates how to find the geographical location for an IP address. It is not hard to get a visitor's IP address via the Request.ServerVariables property, but it is much harder to know where the visitor comes from. To achieve this, the sample uses a free third-party web service from http://freegeoip.appspot.com/, which returns information about the IP address we send it in XML, JSON or CSV format, which makes everything easier. CSASPNETBackgroundWorker: sometimes we perform an operation that needs a long time to complete; it blocks the response, and the page stays blank until the operation finishes. In this case we want the operation to run in the background and the page to display the progress of the running operation, so the user knows that it is running and how far along it is. CSASPNETInheritingFromTreeNode: in the Windows Forms TreeView, each tree node has a property called "Tag" that can store a custom object. Many customers want the same Tag feature in the ASP.NET TreeView, so this project creates a custom TreeView control named "CustomTreeView" to achieve that goal. CSASPNETRemoteUploadAndDownload and VBASPNETRemoteUploadAndDownload: this code sample was created in response to a request submitted through our new code sample request function for customers. The samples demonstrate uploading files to, and downloading files from, a remote HTTP or FTP server; the .NET Framework 2.0 and later versions include lightweight classes that support the HTTP and FTP protocols, and by using these classes we can meet this requirement (a hedged sketch of the download side follows the image upload sample below). CSASPNETImageEditUpload and VBASPNETImageEditUpload: this demo shows how to insert, edit and update a common image of type "jpg", "png", "gif" or "bmp". We mainly use two different SqlDataSources over the same database, bound to a GridView and a FormView, to establish a "cascading" effect, apply a self-made ImageHandler to encode and decode the different image types, and use the context to output the image stream. We explicitly assign the binary streams of the images in the "FormView_ItemInserting" or "FormView_ItemUpdating" event, so that what we see on the aspx page stays in sync with what is really stored in the database.
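    As referenced above, here is a minimal, hedged C# sketch of the download side of the CSASPNETRemoteUploadAndDownload idea, using the standard WebClient and FtpWebRequest classes. The URLs, paths and credentials are placeholders, and the real samples add the upload direction, error handling and configuration.

      using System;
      using System.IO;
      using System.Net;

      class RemoteDownloadSketch
      {
          // Download a file over HTTP. URL and target path are placeholders.
          static void DownloadOverHttp()
          {
              using (var client = new WebClient())
              {
                  client.DownloadFile("http://www.example.com/files/report.pdf",
                                      @"C:\Temp\report.pdf");
              }
          }

          // Download a file over FTP using FtpWebRequest.
          static void DownloadOverFtp()
          {
              var request = (FtpWebRequest)WebRequest.Create("ftp://ftp.example.com/files/report.pdf");
              request.Method = WebRequestMethods.Ftp.DownloadFile;
              request.Credentials = new NetworkCredential("user", "password"); // placeholder credentials

              using (var response = (FtpWebResponse)request.GetResponse())
              using (var remote = response.GetResponseStream())
              using (var local = File.Create(@"C:\Temp\report.pdf"))
              {
                  remote.CopyTo(local); // Stream.CopyTo requires .NET 4.0 or later
              }
          }

          static void Main()
          {
              DownloadOverHttp();
              DownloadOverFtp();
          }
      }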
    New WebBrowser Control, Network and Other Windows General Code Samples. CSWebBrowserSuppressError and VBWebBrowserSuppressError: the sample demonstrates how to make the WebBrowser control suppress errors such as script errors and navigation errors. CSWebBrowserWithProxy and VBWebBrowserWithProxy: the sample demonstrates how to make the WebBrowser control use a proxy server. CSWebDownloadProgress and VBWebDownloadProgress: the sample demonstrates how to show progress during a download, and also supports starting, pausing, resuming and cancelling a download (a hedged sketch of the progress reporting appears after the shell extension samples below). CppSetDesktopWallpaper, CSSetDesktopWallpaper and VBSetDesktopWallpaper: this code sample application lets you select an image, view a preview (resized to fit if necessary), choose a display style among Tile, Center, Stretch, Fit (Windows 7 and later) and Fill (Windows 7 and later), and set the image as the desktop wallpaper (a minimal sketch appears at the end of this entry). CSWindowsServiceRecoveryProperty and VBWindowsServiceRecoveryProperty: the CSWindowsServiceRecoveryProperty example demonstrates how to use ChangeServiceConfig2 to configure a service's "Recovery" properties in C#. The example covers all the options you can see on the service "Recovery" tab, including the "Enable actions for stops with errors" option on Windows Vista and later operating systems. It also shows how to grant the shutdown privilege to the process, so that a special option on the "Recovery" tab - "Restart Computer Options..." - can be configured. New Office Development Code Samples. CSOneNoteRibbonAddIn and VBOneNoteRibbonAddIn: the code sample demonstrates a OneNote 2010 COM add-in that implements IDTExtensibility2. The add-in also supports customizing the Ribbon by implementing the IRibbonExtensibility interface. It is a skeleton OneNote add-in that developers can extend to implement more functions. The code sample was requested by a customer through our code sample request service, and we expect it to help developers in the community. New Windows Shell Code Samples. CppShellExtPreviewHandler, CSShellExtPreviewHandler and VBShellExtPreviewHandler: in the past two months we released code samples for the Windows Context Menu Handler, Infotip Handler, and Thumbnail Handler. This is the fourth part of the shell extension series: the Preview Handler. The code samples demonstrate C++, C# and VB.NET implementations of a preview handler for a new file type registered with the .recipe extension. Preview handlers are called when an item is selected to show a lightweight, rich, read-only preview of the file's contents in the view's reading pane, without launching the file's associated application. Windows Vista and later operating systems support preview handlers. To be a valid preview handler, several interfaces must be implemented, including IPreviewHandler (shobjidl.h); IInitializeWithFile, IInitializeWithStream, or IInitializeWithItem (propsys.h); IObjectWithSite (ocidl.h); and IOleWindow (oleidl.h). There are also optional interfaces, such as IPreviewHandlerVisuals (shobjidl.h), that a preview handler can implement to provide extended support. The Windows API Code Pack for the Microsoft .NET Framework makes the implementation of these interfaces very easy in .NET. The example preview handler provides previews for .recipe files. The .recipe file type is simply an XML file registered as a unique file name extension. It includes the title of the recipe, its author, difficulty, preparation time, cook time, nutrition information, comments, an embedded preview image, and so on. The preview handler extracts the title, comments, and the embedded image, and displays them in a preview window.
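    As referenced above, here is a minimal, hedged C# sketch of the progress-reporting idea behind the WebDownloadProgress samples, using WebClient's asynchronous API. The URL and file name are placeholders; pause and resume, which the real sample also supports, are omitted to keep the sketch short.

      using System;
      using System.Net;
      using System.Threading;

      class DownloadProgressSketch
      {
          static void Main()
          {
              var done = new ManualResetEvent(false);

              using (var client = new WebClient())
              {
                  // Report progress as the download proceeds.
                  client.DownloadProgressChanged += (sender, e) =>
                      Console.WriteLine("{0}% ({1:N0} of {2:N0} bytes)",
                                        e.ProgressPercentage, e.BytesReceived, e.TotalBytesToReceive);

                  // Signal completion, cancellation or failure.
                  client.DownloadFileCompleted += (sender, e) =>
                  {
                      Console.WriteLine(e.Cancelled ? "Cancelled."
                                        : e.Error != null ? "Failed: " + e.Error.Message
                                        : "Done.");
                      done.Set();
                  };

                  // Placeholder URL and target path; client.CancelAsync() would cancel the download.
                  client.DownloadFileAsync(new Uri("http://www.example.com/files/large.zip"),
                                           @"C:\Temp\large.zip");
                  done.WaitOne();
              }
          }
      }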
    In response to many customers' requests, we added setup projects to every shell extension sample in this release. Those setup projects allow you to deploy the shell extensions to your end users' machines. ---------- Download address: http://1code.codeplex.com/releases/view/57459#DownloadId=185534 Updated code sample index categorized by technology: http://1code.codeplex.com/wikipage?title=All-In-One%20Code%20Framework%20Sample%20Catalog (it also allows you to download individual code samples instead of the entire All-In-One Code Framework sample package). If you have any feedback for us, please email [email protected]. We look forward to your comments.
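    As promised above, here is a minimal, hedged C# sketch of what the SetDesktopWallpaper samples boil down to: a single SystemParametersInfo call with the SPI_SETDESKWALLPAPER action. The constants are the documented Win32 values; the image path is a placeholder, and the real samples add the preview, the display-style handling and the conversion to BMP required on older Windows versions.

      using System.Runtime.InteropServices;

      class WallpaperSketch
      {
          // Win32 constants for SystemParametersInfo.
          private const uint SPI_SETDESKWALLPAPER = 0x0014;
          private const uint SPIF_UPDATEINIFILE = 0x01;    // persist the change to the user profile
          private const uint SPIF_SENDWININICHANGE = 0x02; // broadcast the change to running programs

          [DllImport("user32.dll", CharSet = CharSet.Auto, SetLastError = true)]
          private static extern bool SystemParametersInfo(uint uiAction, uint uiParam,
                                                          string pvParam, uint fWinIni);

          static void Main()
          {
              // Placeholder path; Windows versions before 7 generally require a .bmp file here.
              string imagePath = @"C:\Temp\wallpaper.bmp";

              if (!SystemParametersInfo(SPI_SETDESKWALLPAPER, 0, imagePath,
                                        SPIF_UPDATEINIFILE | SPIF_SENDWININICHANGE))
              {
                  throw new System.ComponentModel.Win32Exception(Marshal.GetLastWin32Error());
              }
          }
      }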

    Read the article

  • Heaps of Trouble?

    - by Paul White NZ
    If you’re not already a regular reader of Brad Schulz’s blog, you’re missing out on some great material.  In his latest entry, he is tasked with optimizing a query run against tables that have no indexes at all.  The problem is, predictably, that performance is not very good.  The catch is that we are not allowed to create any indexes (or even new statistics) as part of our optimization efforts. In this post, I’m going to look at the problem from a slightly different angle, and present an alternative solution to the one Brad found.  Inevitably, there’s going to be some overlap between our entries, and while you don’t necessarily need to read Brad’s post before this one, I do strongly recommend that you read it at some stage; he covers some important points that I won’t cover again here. The Example We’ll use data from the AdventureWorks database, copied to temporary unindexed tables.  A script to create these structures is shown below: CREATE TABLE #Custs ( CustomerID INTEGER NOT NULL, TerritoryID INTEGER NULL, CustomerType NCHAR(1) COLLATE SQL_Latin1_General_CP1_CI_AI NOT NULL, ); GO CREATE TABLE #Prods ( ProductMainID INTEGER NOT NULL, ProductSubID INTEGER NOT NULL, ProductSubSubID INTEGER NOT NULL, Name NVARCHAR(50) COLLATE SQL_Latin1_General_CP1_CI_AI NOT NULL, ); GO CREATE TABLE #OrdHeader ( SalesOrderID INTEGER NOT NULL, OrderDate DATETIME NOT NULL, SalesOrderNumber NVARCHAR(25) COLLATE SQL_Latin1_General_CP1_CI_AI NOT NULL, CustomerID INTEGER NOT NULL, ); GO CREATE TABLE #OrdDetail ( SalesOrderID INTEGER NOT NULL, OrderQty SMALLINT NOT NULL, LineTotal NUMERIC(38,6) NOT NULL, ProductMainID INTEGER NOT NULL, ProductSubID INTEGER NOT NULL, ProductSubSubID INTEGER NOT NULL, ); GO INSERT #Custs ( CustomerID, TerritoryID, CustomerType ) SELECT C.CustomerID, C.TerritoryID, C.CustomerType FROM AdventureWorks.Sales.Customer C WITH (TABLOCK); GO INSERT #Prods ( ProductMainID, ProductSubID, ProductSubSubID, Name ) SELECT P.ProductID, P.ProductID, P.ProductID, P.Name FROM AdventureWorks.Production.Product P WITH (TABLOCK); GO INSERT #OrdHeader ( SalesOrderID, OrderDate, SalesOrderNumber, CustomerID ) SELECT H.SalesOrderID, H.OrderDate, H.SalesOrderNumber, H.CustomerID FROM AdventureWorks.Sales.SalesOrderHeader H WITH (TABLOCK); GO INSERT #OrdDetail ( SalesOrderID, OrderQty, LineTotal, ProductMainID, ProductSubID, ProductSubSubID ) SELECT D.SalesOrderID, D.OrderQty, D.LineTotal, D.ProductID, D.ProductID, D.ProductID FROM AdventureWorks.Sales.SalesOrderDetail D WITH (TABLOCK); The query itself is a simple join of the four tables: SELECT P.ProductMainID AS PID, P.Name, D.OrderQty, H.SalesOrderNumber, H.OrderDate, C.TerritoryID FROM #Prods P JOIN #OrdDetail D ON P.ProductMainID = D.ProductMainID AND P.ProductSubID = D.ProductSubID AND P.ProductSubSubID = D.ProductSubSubID JOIN #OrdHeader H ON D.SalesOrderID = H.SalesOrderID JOIN #Custs C ON H.CustomerID = C.CustomerID ORDER BY P.ProductMainID ASC OPTION (RECOMPILE, MAXDOP 1); Remember that these tables have no indexes at all, and only the single-column sampled statistics SQL Server automatically creates (assuming default settings).  The estimated query plan produced for the test query looks like this (click to enlarge): The Problem The problem here is one of cardinality estimation – the number of rows SQL Server expects to find at each step of the plan.  The lack of indexes and useful statistical information means that SQL Server does not have the information it needs to make a good estimate.  
Every join in the plan shown above estimates that it will produce just a single row as output.  Brad covers the factors that lead to the low estimates in his post. In reality, the join between the #Prods and #OrdDetail tables will produce 121,317 rows.  It should not surprise you that this has rather dire consequences for the remainder of the query plan.  In particular, it makes a nonsense of the optimizer’s decision to use Nested Loops to join to the two remaining tables.  Instead of scanning the #OrdHeader and #Custs tables once (as it expected), it has to perform 121,317 full scans of each.  The query takes somewhere in the region of twenty minutes to run to completion on my development machine. A Solution At this point, you may be thinking the same thing I was: if we really are stuck with no indexes, the best we can do is to use hash joins everywhere. We can force the exclusive use of hash joins in several ways, the two most common being join and query hints.  A join hint means writing the query using the INNER HASH JOIN syntax; using a query hint involves adding OPTION (HASH JOIN) at the bottom of the query.  The difference is that using join hints also forces the order of the join, whereas the query hint gives the optimizer freedom to reorder the joins at its discretion. Adding the OPTION (HASH JOIN) hint results in this estimated plan: That produces the correct output in around seven seconds, which is quite an improvement!  As a purely practical matter, and given the rigid rules of the environment we find ourselves in, we might leave things there.  (We can improve the hashing solution a bit – I’ll come back to that later on). Faster Nested Loops It might surprise you to hear that we can beat the performance of the hash join solution shown above using nested loops joins exclusively, and without breaking the rules we have been set. The key to this part is to realize that a condition like (A = B) can be expressed as (A <= B) AND (A >= B).  Armed with this tremendous new insight, we can rewrite the join predicates like so: SELECT P.ProductMainID AS PID, P.Name, D.OrderQty, H.SalesOrderNumber, H.OrderDate, C.TerritoryID FROM #OrdDetail D JOIN #OrdHeader H ON D.SalesOrderID >= H.SalesOrderID AND D.SalesOrderID <= H.SalesOrderID JOIN #Custs C ON H.CustomerID >= C.CustomerID AND H.CustomerID <= C.CustomerID JOIN #Prods P ON P.ProductMainID >= D.ProductMainID AND P.ProductMainID <= D.ProductMainID AND P.ProductSubID = D.ProductSubID AND P.ProductSubSubID = D.ProductSubSubID ORDER BY D.ProductMainID OPTION (RECOMPILE, LOOP JOIN, MAXDOP 1, FORCE ORDER); I’ve also added LOOP JOIN and FORCE ORDER query hints to ensure that only nested loops joins are used, and that the tables are joined in the order they appear.  The new estimated execution plan is: This new query runs in under 2 seconds. Why Is It Faster? The main reason for the improvement is the appearance of the eager Index Spools, which are also known as index-on-the-fly spools.  If you read my Inside The Optimiser series you might be interested to know that the rule responsible is called JoinToIndexOnTheFly. An eager index spool consumes all rows from the table it sits above, and builds a index suitable for the join to seek on.  Taking the index spool above the #Custs table as an example, it reads all the CustomerID and TerritoryID values with a single scan of the table, and builds an index keyed on CustomerID.  The term ‘eager’ means that the spool consumes all of its input rows when it starts up.  
The index is built in a work table in tempdb, has no associated statistics, and only exists until the query finishes executing. The result is that each unindexed table is only scanned once, and just for the columns necessary to build the temporary index.  From that point on, every execution of the inner side of the join is answered by a seek on the temporary index – not the base table. A second optimization is that the sort on ProductMainID (required by the ORDER BY clause) is performed early, on just the rows coming from the #OrdDetail table.  The optimizer has a good estimate for the number of rows it needs to sort at that stage – it is just the cardinality of the table itself.  The accuracy of the estimate there is important because it helps determine the memory grant given to the sort operation.  Nested loops join preserves the order of rows on its outer input, so sorting early is safe.  (Hash joins do not preserve order in this way, of course). The extra lazy spool on the #Prods branch is a further optimization that avoids executing the seek on the temporary index if the value being joined (the ‘outer reference’) hasn’t changed from the last row received on the outer input.  It takes advantage of the fact that rows are still sorted on ProductMainID, so if duplicates exist, they will arrive at the join operator one after the other. The optimizer is quite conservative about introducing index spools into a plan, because creating and dropping a temporary index is a relatively expensive operation.  It’s presence in a plan is often an indication that a useful index is missing. I want to stress that I rewrote the query in this way primarily as an educational exercise – I can’t imagine having to do something so horrible to a production system. Improving the Hash Join I promised I would return to the solution that uses hash joins.  You might be puzzled that SQL Server can create three new indexes (and perform all those nested loops iterations) faster than it can perform three hash joins.  The answer, again, is down to the poor information available to the optimizer.  Let’s look at the hash join plan again: Two of the hash joins have single-row estimates on their build inputs.  SQL Server fixes the amount of memory available for the hash table based on this cardinality estimate, so at run time the hash join very quickly runs out of memory. This results in the join spilling hash buckets to disk, and any rows from the probe input that hash to the spilled buckets also get written to disk.  The join process then continues, and may again run out of memory.  This is a recursive process, which may eventually result in SQL Server resorting to a bailout join algorithm, which is guaranteed to complete eventually, but may be very slow.  The data sizes in the example tables are not large enough to force a hash bailout, but it does result in multiple levels of hash recursion.  You can see this for yourself by tracing the Hash Warning event using the Profiler tool. The final sort in the plan also suffers from a similar problem: it receives very little memory and has to perform multiple sort passes, saving intermediate runs to disk (the Sort Warnings Profiler event can be used to confirm this).  Notice also that because hash joins don’t preserve sort order, the sort cannot be pushed down the plan toward the #OrdDetail table, as in the nested loops plan. Ok, so now we understand the problems, what can we do to fix it?  
We can address the hash spilling by forcing a different order for the joins: SELECT P.ProductMainID AS PID, P.Name, D.OrderQty, H.SalesOrderNumber, H.OrderDate, C.TerritoryID FROM #Prods P JOIN #Custs C JOIN #OrdHeader H ON H.CustomerID = C.CustomerID JOIN #OrdDetail D ON D.SalesOrderID = H.SalesOrderID ON P.ProductMainID = D.ProductMainID AND P.ProductSubID = D.ProductSubID AND P.ProductSubSubID = D.ProductSubSubID ORDER BY D.ProductMainID OPTION (MAXDOP 1, HASH JOIN, FORCE ORDER); With this plan, each of the inputs to the hash joins has a good estimate, and no hash recursion occurs.  The final sort still suffers from the one-row estimate problem, and we get a single-pass sort warning as it writes rows to disk.  Even so, the query runs to completion in three or four seconds.  That’s around half the time of the previous hashing solution, but still not as fast as the nested loops trickery. Final Thoughts SQL Server’s optimizer makes cost-based decisions, so it is vital to provide it with accurate information.  We can’t really blame the performance problems highlighted here on anything other than the decision to use completely unindexed tables, and not to allow the creation of additional statistics. I should probably stress that the nested loops solution shown above is not one I would normally contemplate in the real world.  It’s there primarily for its educational and entertainment value.  I might perhaps use it to demonstrate to the sceptical that SQL Server itself is crying out for an index. Be sure to read Brad’s original post for more details.  My grateful thanks to him for granting permission to reuse some of his material. Paul White Email: [email protected] Twitter: @PaulWhiteNZ

    Read the article

  • How to Use USER_DEFINED Activity in OWB Process Flow

    - by Jinggen He
    Process Flow is a very important component of Oracle Warehouse Builder. With Process Flow, we can create and control the ETL process by arranging all kinds of activities in a well-constructed flow. In Oracle Warehouse Builder 11gR2 there are 28 kinds of activities, which fall into three categories: Control activities, OWB-specific activities and Utility activities. For more information about Process Flow activities, please refer to the OWB online documentation. Most of those activities are pre-defined for a specific use. For example, the Mapping activity allows an OWB mapping to be executed in a Process Flow, and the FTP activity allows interaction between the local host and a remote FTP server. Besides those special-purpose activities, the User Defined activity enables you to incorporate into a Process Flow an activity that is not defined within Warehouse Builder, so the User Defined activity brings flexibility and extensibility to Process Flow. In this article, we will take a tour of using the User Defined activity. Let's start. Enable execution of the User Defined activity. Let's start this section by creating a very simple Process Flow that contains a Start activity, a User Defined activity and an End Success activity. Leave all parameters of the USER_DEFINED activity unchanged, except that we enter /tmp/test.sh into the Value column of the COMMAND parameter. Then let's create the shell script test.sh in the /tmp directory. Here is the content of /tmp/test.sh (this article demonstrates a scenario on a Linux system, and /tmp/test.sh is a Bash shell script): echo Hello World! > /tmp/test.txt Note: don't forget to grant the execute privilege on /tmp/test.sh to the OS oracle user. For simplicity, we just use the following command: chmod +x /tmp/test.sh That is almost all there is to it. Now deploy the Process Flow and run it. On a newly installed OWB, we will come across an error saying "RPE-02248: For security reasons, activity operator Shell has been disabled by the DBA". See below. That is because, by default, the User Defined activity is DISABLED. The relevant configuration can be found in <ORACLE_HOME>/owb/bin/admin/Runtime.properties: property.RuntimePlatform.0.NativeExecution.Shell.security_constraint=DISABLED The property can be set to three different values: NATIVE_JAVA, SCHEDULER and DISABLED. NATIVE_JAVA uses the Java 'Runtime.exec' interface; SCHEDULER uses a DBMS Scheduler external job submitted by the Control Center repository owner, which is executed by the default operating system user configured by the DBA; DISABLED prevents execution via these operators. We enable execution of the User Defined activity by setting: property.RuntimePlatform.0.NativeExecution.Shell.security_constraint=NATIVE_JAVA Restart the Control Center service for the change to take effect: cd <ORACLE_HOME>/owb/rtp/sql sqlplus OWBSYS/<password of OWBSYS> @stop_service.sql sqlplus OWBSYS/<password of OWBSYS> @start_service.sql Then run the Process Flow again. This time the Process Flow completes successfully, and the execution of /tmp/test.sh generates the file /tmp/test.txt containing the line Hello World!. Pass parameters to the User Defined activity. The Process Flow created in the section above has a drawback: the User Defined activity neither accepts any information from OWB nor gives any meaningful result back to OWB. That is to say, it lacks interaction. Sometimes such a Process Flow can still fulfill the business requirement.
    Most of the time, however, we need the User Defined activity to execute according to information produced before that step. In this section, we will see how to pass parameters to the User Defined activity and on into the shell script it executes. First, let's see how to pass parameters to the script. The User Defined activity has an input parameter named PARAMETER_LIST. This is a list of parameters that will be passed to the command. Parameters are separated from one another by a token. The token is taken as the first character of the PARAMETER_LIST string, and the string must also end in that token. Warehouse Builder recommends the '?' character, but any character can be used. For example, to pass 'abc', 'def' and 'ghi' you can use any of the following equivalents: ?abc?def?ghi? or !abc!def!ghi! or |abc|def|ghi| If the token character or '\' needs to be included as part of a parameter, it must be preceded with '\', for example '\\'. If '\' is the token character, then '/' becomes the escape character. Let's configure the PARAMETER_LIST parameter as below and modify the shell script /tmp/test.sh as follows: echo $1 is saying hello to $2! > /tmp/test.txt Re-deploy the Process Flow and run it. The generated /tmp/test.txt now contains the following line: Bob is saying hello to Alice! In the example above, the parameters passed into the shell script are static. That is not very useful, because instead of passing parameters we could simply write their values directly in the shell script. To make the case more meaningful, we can pass two dynamic parameters, obtained from the previous activity, to the shell script. Prepare the Process Flow as below. The Mapping activity MAPPING_1 has two output parameters, FROM_USER and TO_USER, and the User Defined activity has two input parameters, FROM_USER and TO_USER; all four parameters are of String type. Additionally, the Process Flow has two string variables: VARIABLE_FOR_FROM_USER and VARIABLE_FOR_TO_USER. Through VARIABLE_FOR_FROM_USER, the input parameter FROM_USER of USER_DEFINED gets its value from the output parameter FROM_USER of MAPPING_1; we achieve this by binding both parameters to VARIABLE_FOR_FROM_USER (see the two figures below). In the same way, through VARIABLE_FOR_TO_USER, the input parameter TO_USER of USER_DEFINED gets its value from the output parameter TO_USER of MAPPING_1. We also need to change the PARAMETER_LIST of the User Defined activity as below. Now the shell script gets its input from the Mapping activity dynamically. Deploy the Process Flow and all of its necessary dependencies, then run the Process Flow. The generated /tmp/test.txt contains the following line: USER B is saying hello to USER A! 'USER B' and 'USER A' are two outputs of the Mapping execution. Write the shell script within Oracle Warehouse Builder. In the previous section, the shell script lives in the /tmp directory. Sometimes, though, when the shell script is small, or for the sake of consistency, you may want to keep the shell script inside Oracle Warehouse Builder. We can achieve this by configuring three parameters of the User Defined activity properly: COMMAND: set the path of the interpreter that will interpret the shell script. PARAMETER_LIST: leave it blank. SCRIPT: enter the shell script content. Note that in Linux the shell script content is passed to the interpreter as standard input at runtime. To actually pass parameters to the shell script, we can use variable substitutions.
    As in the following figure, ${FROM_USER} will be replaced by the value of the FROM_USER input parameter of the User Defined activity, and so will the ${TO_USER} symbol. Besides the custom substitution variables, OWB also provides some pre-defined system substitution variables; you can refer to the online documentation for those. Deploy the Process Flow and run it. The generated /tmp/test.txt contains the following line: USER B is saying hello to USER A! Leverage the return value of the User Defined activity. All of the previous sections connect the User Defined activity to END_SUCCESS with an unconditional transition. But what should we do if we want different subsequent activities for different shell script execution results? 1. The simplest way is to add three outgoing transitions with simple conditions to the User Defined activity, just like the figure below. In the figure, to simplify the scenario, we connect the User Defined activity to three End activities. Basically, if the shell script ends successfully, the whole Process Flow ends at END_SUCCESS; otherwise it ends at END_ERROR (in our case, ending at END_WARNING seldom happens). In the real world, we can add more complex and meaningful subsequent business logic. 2. Or we can use complex conditions to work with the different results of the User Defined activity. Previously, our script only had this line: echo ${FROM_USER} is saying hello to ${TO_USER}! > /tmp/test.txt We can add more logic to it and return different values accordingly: echo ${FROM_USER} is saying hello to ${TO_USER}! > /tmp/test.txt if CONDITION_1 ; then ...... exit 0 fi if CONDITION_2 ; then ...... exit 2 fi if CONDITION_3 ; then ...... exit 3 fi After that, we can leverage the result by checking RESULT_CODE in the condition expressions of those outgoing transitions. Let's suppose we have the Process Flow shown in the following graph (SUB_PROCESS_n stands for further, different processes). We can set a complex condition for the transition from USER_DEFINED to SUB_PROCESS_1 like this, and the other transitions can be set in the same way. Note that in our shell script we return 0, 2 and 3, but not 1: on a Linux system, if the shell script hits a system error such as an IO error, the return value will be 1, and we can explicitly handle that return value. Summary. Let's summarize what has been discussed in this article: how to create a Process Flow with a User Defined activity in it; how to pass parameters from the prior activity to the User Defined activity and finally into the shell script; how to write the shell script within Oracle Warehouse Builder; how to do variable substitutions; and how to let the User Defined activity return different values and leverage them.

    Read the article

  • top tweets WebLogic Partner Community – October 2012

    - by JuergenKress
    Send your tweets @wlscommunity #WebLogicCommunity and follow us at http://twitter.com/wlscommunity WebLogic Community?@wlscommunity Real World Java EE Patterns by Adam Bien http://wp.me/p1LMIb-mp Markus Eisele?@myfear #JavaOne Content Available for Free https://blogs.oracle.com/java/entry/javaone_content_available_for_free … /via @java Adam Bien?@AdamBien Thought that 1h screencast is way too long to be popular. I was wrong. Lightweight Java EE is doing very well: http://www.adam-bien.com/roller/abien/entry/lightweight_java_ee_screencast … OracleBlogs?@OracleBlogs COLLABORATE 13 Call for Papers http://ow.ly/2szPuZ Oracle WebLogic?@OracleWebLogic New Blog Post: Data Source Security Part 1 http://ow.ly/2szFbv Markus Eisele?@myfear My Three Days at #JavaOne 2012 http://yakovfain.com/2012/10/04/my-three-days-at-javaone-2012/ … < nice writeup ;) Adam Bien?@AdamBien JavaOne 2012 Announcements And Surprises: NetBeans 7.3+ comes with HTML 5, JavaScript, CSS 3 support. JavaScript... http://bit.ly/Uy14eD Andrejus Baranovskis?@andrejusb OOW'12: Oracle ADF Implementations Around the Globe: Best Practices http://fb.me/1IVg6gzU0 gschmutz?@gschmutz Just published a blog with a wrap-up of my presentations at OOW 2012. https://guidoschmutz.wordpress.com/2012/10/07/my-presentations-at-oracle-open-world-2012/ … #oow2012 #trivadis Andrejus Baranovskis?@andrejusb OOW'12: Oracle Business Process Management/Oracle ADF Integration Best Practices http://fb.me/1GY3nz1lb WebLogic Community?@wlscommunity ExaLogic 2.01 ppt & training & Installation check-list & tips & Web tier roadmap http://wp.me/p1LMIb-mh Adam Bien?@AdamBien JavaOne 2012, First Feedback and The Strange Thing: NetBeans day was surprising well attended. A big room was fu... http://bit.ly/PwWwx8 OracleSupport_WLS?@weblogicsupport Free registration for our next webcast on setting up and using a #weblogic #cluster http://pub.vitrue.com/xWV8 WebLogic Community?@wlscommunity UKOUG Application Server & Middleware SIG Meeting http://wp.me/p1LMIb-mC Ronald Luttikhuizen?@rluttikhuizen Discussing future plans for Oracle Middleware Infrastructure Group with @simon_haslam @Jphjulstad and Rene van Wijk #oow @wlscommunity JAX London?@jaxlondon Be part of #JAXLondon- only 11 days to go! Still need a ticket? http://buff.ly/TUPKmL WebLogic Community?@wlscommunity ExaLogic X3-2 launched at OOW 2012 http://wp.me/p1LMIb-mM WebLogic Community?@wlscommunity @OracleEvents Dear Oracle Team thanks for promoting the WebLogic bootcamp, new schedules are online https://blogs.oracle.com/emeapartnerweblogic/resource/weblogic12c.htm … #weblogiccommunity OracleBlogs?@OracleBlogs Partner Webcast Introducing Oracle Business Activity Monitoring - 18 October 2012 http://ow.ly/2svzyz AMIS, Oracle & Java?@AMIS_Services Grant posted a nice little video on youtube about the #ADF EMG activities during Oracle Open World. http://youtu.be/qZhtBqnK-Zc GlassFish?@glassfish ADF Essentials - Available for free and certified on GlassFish!: If you are an Oracle customer, you are probably... http://bit.ly/UCtVwY OracleBlogs?@OracleBlogs WebLogic 12 hands-on bootcamps for partnersnew dates & locations http://ow.ly/2smOfs Pieter Kranenburg?@pskranenburg I'm EXA and I know IT! How about you? Go to http://bit.ly/OnSlDd and find out! (you might win an #iphone5 ;-) #OOW please RT Andrejus Baranovskis?@andrejusb Enabling WebLogic Administrator Group Inside Custom ADF Application http://fb.me/2d5SCeJ2g Michel Schildmeijer?@MNEMONIC01 I'm EXA and I know IT! How about you? 
Go to http://bit.ly/OnSlDd (you might win an #iphone5 ;-) #oow OracleSupport_WLS?@weblogicsupport Step-by-step instructions on how to configure mail Alerts in #OEM 11g for #WebLogic Servers up/down status http://pub.vitrue.com/KpZq Jeff West?@jeffreyawest Answer: Deliver JMS message to a single node in a Weblogic Cluster with a Distributed Topic http://stackoverflow.com/a/12396492/697114?stw=2 … Java?@java Bucharest Java User Group: Launched and Growing! #JUG http://ow.ly/dDnbN OracleSupport_WLS?@weblogicsupport Don't shoot the messenger! #Java source code analyzer @ http://pub.vitrue.com/Cy2J JAX London?@jaxlondon .@BrianGoetz gives in depth session on the details of how #Lambda expressions are implemented in the #Java language at #JAXLondon" ADF Community DE?@ADFCommunityDE Webcast ADFNewsSession: ADF as a basis of Fusion Apps - the biggest ADF project ever. Sep 14, 8:30 AM CET. Dial in https://blogs.oracle.com/jdevotnharvest/entry/adf_partner_community_news_session … OracleBlogs?@OracleBlogs WebLogic & Coherence & Cloud presentations for customer meetings http://ow.ly/1mqwrC Pieter Kranenburg?@pskranenburg Seminar: Oracle WebLogic 12c at Qualogy. You are invited! http://bit.ly/Ps9LDF Oracle WebLogic?@OracleWebLogic New Blog Post: Oracle OpenWorld Update -- General Session: Oracle Fusion Middleware Strategies Driving Business Inno... http://ow.ly/2stylf Oracle Cloud Zone?@OracleCloudZone New partner programs for Oracle Cloud Solutions http://bit.ly/PrVq5O #cloud #oow Lucas Jellema?@lucasjellema The strategy on Java - JEE, SE, ME, FX: http://technology.amis.nl/2012/10/02/javaone-2012-strategy-and-technical-keynote/ … #javaone #oow_amis WebLogic Community?@wlscommunity Send your #WebLogicCommunity #oow pictures and blog posts @wlscommunity or http://www.facebook.com/weblogiccommunity … Enjoy OOW ;-) WebLogic Community?@wlscommunity Become an WebLogic 12c expert, attend our partner bootcampshttps://blogs.oracle.com/emeapartnerweblogic/resource/weblogic12c.htm … #WebLogicCommunity #opn AMIS, Oracle & Java?@AMIS_Services Volgende #oracle #ADF training bij @AMIS_SERVICES is van 12 tot 16 november. Meer info of aanmelden? http://www.amis.nl/Trainingen/oracle-adf-11g-applicatieontwikkeling/ … Devoxx?@Devoxx ALL the Devoxx 2011 talks are now freely available on Parleys @ http://www.parleys.com/#st=4&id=102998 Pls RT! Adam Bien?@AdamBien Use the coupon code "PLUMA" and you will get 20% off for "Real World Java EE Patterns": http://realworldpatterns.com Lucas Jellema?@lucasjellema Very good summary of the #JavaOne Technical Keynote last night: http://java.dzone.com/articles/javaone-2012-javaone-technical … Arun Gupta?@arungupta Blogged: JavaOne 2012 Keynote and GlassFish Party Pictures: Some pictures from the keynote ... And som... http://bit.ly/ViH0ue Lucas Jellema?@lucasjellema Most recent promoted build for GassFish 4.0 (EE7) has WebSocket support: to play with: http://dlc.sun.com.edgesuite.net/glassfish/4.0/promoted/ … #javaone michael palmeter?@michaelpalmeter If you haven't seen the 5-minute Exalogic demo, you need to (do it now!) - http://lnkd.in/GRqy3x Lonneke Dikmans?@lonnekedikmans VENNSTER BLOG: Running EclipseLink DBWS 2.4.0 on GlassFish 3.1.2 http://blog.vennster.nl/2012/09/running-eclipselink-dbws-240-on.html?spref=tw … WebLogic Community?@wlscommunity WebLogic Partner Community Newsletter September 2012 http://wp.me/p1LMIb-mf WebLogic Community?@wlscommunity again again again&hellip;. 
it is Oracle Open World 2012 http://wp.me/p1LMIb-m6 Markus Eisele?@myfear #WebLogic and #JavaEE Roadmap and Strategy Session at OOW http://ow.ly/2slZEY /via @OracleWebLogic Adam Bien?@AdamBien An Article About Java EE Connector Architectures 1.6 (JCA 1.6): The free Java Magazine article: Java EE Connect... http://bit.ly/St6sxq Lucas Jellema?@lucasjellema ADF Essentials - free to develop and to deploy (I said: free!) - http://www.oracle.com/technetwork/developer-tools/adf/overview/adfessentials-1719844.html … AMIS, Oracle & Java?@AMIS_Services Blog by Lucas Jellema: "Develop and Deploy ADF applications – free of charge using the new ADF Essentials" http://bit.ly/StAhxY Andrejus Baranovskis?@andrejusb ADF Essentials - Quick Technical Review http://fb.me/2hKCXyF43 OracleBlogs?@OracleBlogs GlassFish Extension for Oracle JDeveloper http://ow.ly/2slIO8 Retweetet von WebLogic Community Oracle Eclipse?@OEPE New Tutorial: Using ADF Faces and ADF Controller with Oracle Enterprise Pack for Eclipse. #OEPE http://pub.vitrue.com/QoUg Simon Haslam?@simon_haslam As of the last day or two there's a new Java Products Media Pack on http://edelivery.oracle.com (rather than it being in FMW pack) WebLogic Community?@wlscommunity top tweets WebLogic Partner Community &ndash; September 2012 http://wp.me/p1LMIb-m2 Adam Bien?@AdamBien I was interviewed by OTN: http://www.oracle.com/technetwork/articles/java/jaxawards-1843595.html …See you at JavaOne! Oracle WebLogic?@OracleWebLogic DevOps Basics for #WebLogic: Track Down High CPU Thread with ps, top and the new JDK7 jcmd tool. Great blog @frankmuz. http://ow.ly/dOBM4 Simon Haslam?@simon_haslam Looking for "oak style"(!) advanced content but you're a middleware specialist? See #ukoug2012 #middlewaresunday http://2012.ukoug.org/default.asp?p=9355 … Julien Ponge ?@jponge Just finished Java EE 6 + AngularJS samples for my upcoming middleware lectures. Code at https://github.com/jponge/todoapp-javaee6-angularjs … and https://github.com/jponge/todoapp-bosswatch … Markus Eisele?@myfear #Oracle #WebLogic is now totally #FREE for #Developer - more than just OTN license to develop the 1st prototype! http://bit.ly/SWltsR Markus Eisele?@myfear #WebSockets on #WebLogic Server http://ow.ly/1mv4QP by @wlsteve < need to give this a testdrive ;) OracleEnterpriseMgr?@oracle_em EM Blog : Oracle Enterprise Manager Cloud Control 12c Release 2 (12.1.0.2) is Available Now ! #em12c http://pub.vitrue.com/mk7o OracleBlogs?@OracleBlogs ADF training material now on the iPad http://ow.ly/1mqz1Q GlassFish?@glassfish GlassFish grows by 50% in Software Stack Market Share Report for August 2012 by @Jelastic http://awe.sm/o4ZAp WebLogic Partner Community For regular information become a member in the WebLogic Partner Community please visit: http://www.oracle.com/partners/goto/wls-emea ( OPN account required). If you need support with your account please contact the Oracle Partner Business Center. Blog Twitter LinkedIn Mix Forum Wiki Technorati Tags: twitter,WebLogic,WebLogic Community,Oracle,OPN,Jürgen Kress

    Read the article

  • Community Conversation

    - by ultan o'broin
    Applications User Experience members (Erika Webb, Laurie Pattison, and I) attended the User Assistance Europe Conference in Stockholm, Sweden. We were impressed with the thought leadership and practical application of ideas in Anne Gentle's keynote address "Social Web Strategies for Documentation". After the conference, we spoke with Anne to explore the ideas further. Applications User Experience Senior Director Laurie Pattison (left) with Anne Gentle at the User Assistance Europe Conference In Anne's book called Conversation and Community: The Social Web for Documentation, she explains how user assistance is undergoing a seismic shift. The direction is away from the old print manuals and online help concept towards a web-based, user community-driven solution using social media tools. User experience professionals now have a vast range of such tools to start and nurture this "conversation": blogs, wikis, forums, social networking sites, microblogging systems, image and video sharing sites, virtual worlds, podcasts, instant messaging, mashups, and so on. That user communities are a rich source of user assistance is not a surprise, but the extent of available assistance is. For example, we know from the Consortium for Service Innovation that there has been an 'explosion' of user-generated content on the web. User-initiated community conversations provide as much as 30 times the number of official help desk solutions for consortium members! The growing reliance on user community solutions is clearly a user experience issue. Anne says that user assistance as conversation "means getting closer to users and helping them perform well. User-centered design has been touted as one of the most important ideas developed in the last 20 years of workplace writing. Now writers can take the idea of user-centered design a step further by starting conversations with users and enabling user assistance in interactions." Some of Anne's favorite examples of this paradigm shift from the world of traditional documentation to community conversation include: * Writer Bob Bringhurst's blog about Adobe InDesign and InCopy products and Adobe's community help * The Microsoft Development Network Community Center * ·The former Sun (now Oracle) OpenDS wiki, NetBeans Ruby and other community approaches to engage diverse audiences using screencasts, wikis, and blogs. * Cisco's customer support wiki, EMC's community, as well as Symantec and Intuit's approaches * The efforts of Ubuntu, Mozilla, and the FLOSS community generally Adobe Writer Bob Bringhurst's Blog Oracle is not without a user community conversation too. Besides the community discussions and blogs around documentation offerings, we have the My Oracle Support Community forums, Oracle Technology Network (OTN) communities, wiki, blogs, and so on. We have the great work done by our user groups and customer councils. Employees like David Haimes are reaching out, and enthusiastic non-employee gurus like Chet Justice (OracleNerd), Floyd Teter and Eddie Awad provide great "how-to" information too. But what does this paradigm shift mean for existing technical writers as users turn away from the traditional printable PDF manual deliverables? We asked Anne after the conference. The writer role becomes one of conversation initiator or enabler. The role evolves, along with the process, as the users define their concept of user assistance and terms of engagement with the product instead of having it pre-determined. 
    It is largely a case now of "inventing the job while you're doing it, instead of being hired for it," Anne said. There is less emphasis on formal titles. Anne mentions that her own title is "Content Stacker" at OpenStack; others use titles such as "Content Curator" or "Community Lead". However, the role remains one essentially about communications, "but of a new type--interacting with users, moderating, curating content, instead of sitting down to write a manual from start to finish." Clearly then, this role is open to more than professional technical writers. Product managers who write blogs, developers who moderate forums, support professionals who update wikis, and rock star programmers with a penchant for YouTube are all ideal. Anyone with the product knowledge, empathy for the user, and flair for relationships on the social web can join in. Some even perform these roles already but do not realize it. Anne feels the technical communicator space will move from hiring new community conversation professionals (who are already active in the space through blogging, tweets, wikis, and so on) to retraining some existing writers over time. Our own research reveals that the established proponents of community user assistance even set employee performance objectives for internal content curators about the amount of community content delivered by people outside the organization! To take advantage of the conversations on the web as user assistance, enterprises must first establish where on the spectrum their community lies. "What is the line between community willingness to contribute and the enterprise objectives?" Anne asked. "The relationship with users must be managed and also measured." Anne believes that the process can start with a "just do it" approach. Begin by reaching out to existing user groups, individual bloggers and tweeters, forum posters, early adopter program participants, conference attendees, customer advisory board members, and so on. Use analytical tools to measure the level of conversation about your products and services to show a return on investment (ROI), winning management support. Anne emphasized that success with the community model is dependent on lowering the technical and motivational barriers so that users can readily contribute to the conversation. Simple tools must be provided, and guidelines, if any, must be straightforward but not mandatory. The conversational approach is one where traditional style and branding guides do not necessarily apply. Tools and infrastructure help users to create content easily, to search and find the information online, read it, rate it, translate it, and participate further in the content's evolution. Recognizing contributors by using ratings on forums, giving out Twitter kudos, conference invitations, visits to headquarters, free products, preview releases, and so on, also encourages the adoption of the conversation model. The move to conversation as user assistance is not free, but there is a business ROI. The conversational model means that customer service is enhanced, as user experience moves from a functional to a valued, emotional level. Studies show a positive correlation between loyalty and financial performance (Consortium for Service Innovation, 2010), and as customer experience and loyalty become key differentiators, user experience professionals cannot afford to ignore the model's possibilities. 
The digital universe (measured at 1.2 million petabytes in 2010) is doubling every 12 to 18 months, and 70 percent of that universe consists of user-generated content (IDC, 2010). Conversation as user assistance cannot be ignored but must be embraced. It is a time to manage for abundance, not scarcity. Besides, the conversation approach certainly sounds more interesting, rewarding, and fun than the traditional model! I would like to thank Anne for her time and thoughts, and recommend that all user assistance professionals read her book. You can follow Anne on Twitter at: http://www.twitter.com/annegentle. Oracle's Acrolinx IQ deployment was used to author this article.

    Read the article

  • Oracle Database, Your Best Option for Reducing IT Costs

    - by Ivan Hassig
    By Victoria Cadavid, Sr. Sales Consultant, Oracle Direct. One of the main challenges in data center administration is reducing operating costs. As companies grow and technology vendors offer increasingly robust solutions, keeping the balance between performance, support for the business, and management of Total Cost of Ownership becomes an ever greater challenge for IT managers and data center administrators. The most common strategies for reducing data center administration costs, and IT management costs in general, focus on improving application performance, reducing hardware administration and acquisition costs, reducing storage costs, increasing productivity in database administration, and improving the handling of requests and help desk services. Cost reduction strategies, however, should also address the costs associated with data loss and theft, regulatory compliance, value generation, and business continuity, which are commonly conceived as isolated initiatives that are not always undertaken with cost reduction in mind. A comprehensive IT cost reduction initiative must consider every factor that generates cost and can be optimized. In this article we want to approach technology cost reduction through the adoption of what experts consider the #1 database engine on the market. For years, the Oracle database has been recognized for its speed, reliability, security, and ability to support the workloads of highly transactional applications as well as data warehouses and even Big Data analysis, offering high performance and ease of administration. However, when we think about IT cost reduction projects, beyond the ability to support applications (even highly transactional ones) with high performance, we think about automation, resource optimization, consolidation, virtualization, and even more convenient licensing alternatives. The Oracle Database is designed to provide all the capabilities an IT department needs to reduce costs, adapting to different business scenarios and to the capabilities and characteristics of each organization. In addition to the database engine itself, Oracle offers a series of solutions to optimize information management through mechanisms for optimizing storage use, business continuity, infrastructure consolidation, security, and automatic administration, which promote better use of technology resources, offer advanced configuration options, and reduce the time spent on the most common operational tasks. One of the database options that can be leveraged to reduce hardware costs is Oracle Real Application Clusters. 
    This clustering solution allows several servers (even low-cost servers) to work together to support database grids or private database clouds, providing the benefits of infrastructure consolidation, high-availability schemes, fast performance, and on-demand scalability. Provisioning, database maintenance, and the addition of new nodes are carried out faster and with less risk, while leveraging investments in lower-cost servers. Another solution that promotes IT cost reduction is Oracle In-Memory Database Cache, which allows data to be stored and processed in the applications' memory, making the most of the middle tier's processing resources, something especially valuable in highly transactional scenarios. In this way, processing resources are used to their fullest and unnecessary growth in hardware resources is avoided. Another way to avoid unnecessary hardware investments while making the most of existing resources, even when data volumes are growing rapidly, is data compression. Oracle Advanced Compression can compress the different data types by up to 4x, improving storage capacity without compromising application performance. Significant IT cost reductions can also be achieved on the storage side. Here, the Oracle Database's own technology offers Automatic Storage Management capabilities that not only distribute data optimally across the physical disks to guarantee maximum performance, but also simplify provisioning and the removal of faulty disks, and provide balancing and mirroring, ensuring maximum use of each device and the availability of the data. Another solution that simplifies storage administration is Oracle Partitioning, a database option that allows large tables to be divided into smaller structures. This approach simplifies information lifecycle management and makes it possible, for example, to separate historical data (which generally becomes read-only and is not queried heavily) and move it to low-cost storage, keeping the active data on faster storage devices. In addition, Oracle Partitioning simplifies the administration of databases with a very large number of records and improves database performance, since queries can be optimized to use only the relevant partitions of a table or index during a search. Other factors that can generate unnecessary costs for IT departments are the loss, corruption or theft of data, and the lack of application availability to support the business. To avoid these situations, which can lead to fines and to lost business and money, Oracle offers solutions to protect and audit the database, to recover information in the event of corruption or of actions that compromise data integrity, and to guarantee that application data is available 24x7. 
    We have already discussed the benefits of Oracle RAC for consolidation and for improving application performance; this solution is also extremely useful in scenarios where organizations want to guarantee high availability of their information in the face of server failures or planned outages for maintenance work. Beyond Oracle RAC, solutions such as Oracle Data Guard and Active Data Guard automatically replicate databases to a contingency data center, allowing immediate recovery from events that take an entire data center offline. In addition, Active Data Guard allows the standby database to be used for queries, improving application performance. From the security standpoint, Oracle offers solutions such as Advanced Security, which encrypts data and the channels over which information is shared; Total Recall, which shows the changes made to the database at a given point in time, to prevent data loss and corruption; Database Vault, which restricts privileged users' access to confidential information; Audit Vault, which verifies who did what and when within an organization's databases; and Oracle Data Masking, which masks data to protect sensitive information and to comply with policies and regulations on confidential data protection, for example while applications move from the development environment to the production environment. As we mentioned at the beginning, IT cost reduction initiatives must be based on strategies that consider the various factors that can generate cost overruns, the risk factors that can bring unforeseen costs, the use of current resources to avoid unnecessary investments, and the optimization factors that allow current investments to be exploited to the fullest. As we have seen, all these initiatives can be addressed using Oracle technology at the database level. The most important things are to detect the critical risk points, to assess how well current resources are being used, and to define the priorities of the organization and the IT department, so as to launch all those initiatives that, gradually, will avoid cost overruns and unnecessary investments, providing greater support to the business and a significant impact on the organization's productivity. More information: http://www.oracle.com/lad/products/database/index.html?ssSourceSiteId=otnes 1 Source: Market Share: All Software Markets, Worldwide 2011, by Colleen Graham, Joanne Correia, David Coyle, Fabrizio Biscotti, Matthew Cheung, Ruggero Contu, Yanna Dharmasthira, Tom Eid, Chad Eschinger, Bianca Granetto, Hai Hong Swinehart, Sharon Mertz, Chris Pang, Asheesh Raina, Dan Sommer, Bhavish Sood, Marianne D'Aquila, Laurie Wurster and Jie Zhang, March 29, 2012. 2 Big Data: information gathered from non-traditional sources such as blogs, social networks, email, sensors, photographs, video recordings, etc., which is normally unstructured and very large in volume.

    Read the article
