Search Results

Search found 26810 results on 1073 pages for 'fixed point'.


  • Convert JSON String to Java Object or HashMap

    - by Priyank
    Hi. I am writing an Android app. I want to pass some data across intents/activities, and I feel that converting to and from JSON is probably the most practical way at this point. I am able to convert a Java HashMap to a JSON string successfully using the JSONObject support. However, I need to convert that JSON string back into a Java object or a HashMap. What is the best way to go about it? Is Parcelable really worth implementing if I have a simple five-field object? What are the other ways to transfer data between intents? Cheers
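
    One way back is the same org.json support used for the forward conversion: parse the string with JSONObject and walk its keys. A minimal sketch, assuming a flat map of simple values (nested objects or arrays would need recursive handling):

    ```java
    import org.json.JSONException;
    import org.json.JSONObject;

    import java.util.HashMap;
    import java.util.Iterator;
    import java.util.Map;

    public class JsonToMap {
        // Convert a JSON string back into a HashMap of key/value pairs.
        public static Map<String, Object> toMap(String json) throws JSONException {
            JSONObject obj = new JSONObject(json);
            Map<String, Object> map = new HashMap<String, Object>();
            Iterator<?> keys = obj.keys();
            while (keys.hasNext()) {
                String key = (String) keys.next();
                map.put(key, obj.get(key));
            }
            return map;
        }
    }
    ```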

    Read the article

  • ClickOnce Deployment online questions

    - by David Archer
    Hi all, Bit of a strange question, but how do ClickOnce deployments work from a web site? I seem to be having some problems with this. Basically, the setup file will download when you click the "install" button, but then some files are missing. Do you need to be on a Microsoft server to run ClickOnce deployments? I usually do deployments over a local server with UNC, and as this is the first time I've done one online I'm struggling a bit. Any newbie tutorials you can point me to would be great, and if I do need a special host for it, could you please recommend some? Cheers!

    Read the article

  • loading data from file into 2d array

    - by Chris
    I am just starting with Perl and would like some help with arrays, please. I am reading lines from a data file and splitting each line into fields: open (INFILE, $infile); do { my $linedata = <INFILE>; my @data = split ',', $linedata; .... } until eof; I then want to store the individual field values (in @data) in an array so that the array mirrors the input data file, i.e., the first "row" of the array contains the first line of data from INFILE, etc. Each line of data from the infile contains 4 values, x, y, z and w, and once the data are all in the array, I have to pass the array into another program which reads the x,y,z,w and displays the w value on a screen at the point determined by the x,y,z value. I cannot pass the data to the other program on a row-by-row basis as the program expects the data to be in a 2D matrix format. Any help greatly appreciated. Chris
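
    Perl builds 2D structures as arrays of array references, so one approach is to push a reference to each split line onto a top-level array. A minimal sketch, assuming a comma-separated file of x,y,z,w rows (the filename is a placeholder):

    ```perl
    use strict;
    use warnings;

    my $infile = 'data.csv';    # hypothetical input file
    open my $fh, '<', $infile or die "Cannot open $infile: $!";

    my @matrix;                 # array of array references (the 2D array)
    while (my $line = <$fh>) {
        chomp $line;
        my @fields = split /,/, $line;   # x, y, z, w
        push @matrix, \@fields;          # store a reference to this row
    }
    close $fh;

    # Element access: row $i, column $j.
    # For example, the w value of the first row:
    my $w = $matrix[0][3];
    ```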

    Read the article

  • Mock Repository vs. Real Repository w/Mocked Data

    - by n8wrl
    I must be doing something fundamentally wrong. I am implementing my repositories and then testing them with mocked data. All is well. Now I want to test my domain objects, so I point them at mock repositories. But I'm finding that I have to re-implement logic from the 'real' repositories into the mocks, or create 'helper classes' that encapsulate the logic and interact with the repositories (real or mock), and then I have to test those too. So what am I missing - why implement and test mock repositories when I could use the real ones with mocked data? EDIT: To clarify, by 'mocked data' I do not hit the actual database. I have a 'DB mock layer' I can insert under the real repositories that returns known data.
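
    The "real repository over mocked data" arrangement the poster describes can be sketched like this in C# (all type names here are hypothetical, invented for illustration):

    ```csharp
    using System.Collections.Generic;
    using System.Linq;

    // Hypothetical seam under the real repository: only data access is faked.
    public interface IDataLayer
    {
        IEnumerable<Customer> LoadCustomers();
    }

    public class Customer
    {
        public int Id { get; set; }
        public bool IsActive { get; set; }
    }

    // The real repository, with its real logic, used in production and tests.
    public class CustomerRepository
    {
        private readonly IDataLayer _data;
        public CustomerRepository(IDataLayer data) { _data = data; }

        public IEnumerable<Customer> GetActiveCustomers()
        {
            return _data.LoadCustomers().Where(c => c.IsActive);
        }
    }

    // In a test, insert a fake data layer that returns known data.
    public class FakeDataLayer : IDataLayer
    {
        public IEnumerable<Customer> LoadCustomers()
        {
            return new[]
            {
                new Customer { Id = 1, IsActive = true },
                new Customer { Id = 2, IsActive = false },
            };
        }
    }
    ```

    With this shape, the repository logic is written and tested once; the mocks only ever stand in for the database, never for the logic.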

    Read the article

  • Project Management Helps AmeriCares Deliver International Aid

    - by Sylvie MacKenzie, PMP
    Excerpt from Profit (Oracle), by Alison Weiss

    Handle with Care

    Sound project management helps AmeriCares bring international aid to those in need.

    The stakes are always high for AmeriCares. On a mission to restore health and save lives during times of disaster, the nonprofit international relief and humanitarian aid organization delivers donated medicines, medical supplies, and humanitarian aid to people in the U.S. and around the globe. Founded in 1982 with the express mission of responding as quickly and efficiently as possible to help people in need, the Stamford, Connecticut-based AmeriCares has delivered more than US$10.5 billion in aid to 147 countries over the past three decades.

    “It’s critically important to us that we steward all the donations and that the medical supplies and medicines get to people as quickly as possible with no loss,” says Kate Sears, senior vice president for finance and technology at AmeriCares. “Whether we’re shipping IV solutions to victims of cholera in Haiti or antibiotics to Somali famine victims, we need to get the medicines there sooner because it means more people will be helped and lives improved or even saved.”

    Ten years ago, the tracking systems used by AmeriCares associates were paper-based. In recent years, staff started using spreadsheets, but the tracking processes were not standardized between teams. “Every team was tracking completely different information,” says Megan McDermott, senior associate, Sub-Saharan Africa partnerships, at AmeriCares. “It was just a few key things. For example, we tracked the date a shipment was supposed to arrive and the date we got reports from our partner that a hospital received aid on their end.” While the data was accurate, much detail was being lost in the process.

    AmeriCares management knew it could do a better job of tracking this enterprise data and in 2011 took a significant step by implementing Oracle’s Primavera P6 Professional Project Management. “It’s a comprehensive solution that has helped us improve the monitoring and controlling processes. It has allowed us to do our distribution better,” says Sears. In addition, the implementation effort has been a change agent, helping AmeriCares leadership rethink project management across the entire organization. Initially, much of the focus was on standardizing processes, but staff members also learned the importance of thinking proactively to prevent possible problems and of evaluating results to determine if goals and objectives are truly being met. Such data about process efficiency and overall results is critical not only to AmeriCares staff but also to the donors supporting the organization’s life-saving missions.

    Efficiency Saves Lives

    One of AmeriCares’ core operations is to gather product donations from the private sector, establish where the most-urgent needs are, and solicit monetary support to send the aid via ocean cargo or airlift to welfare- and health-oriented nongovernmental organizations, hospitals, health networks, and government ministries based in areas in need. In 2011 alone, AmeriCares sent more than 3,500 shipments to 95 countries in response to both ongoing humanitarian needs and more than two dozen emergencies, including deadly tornadoes and storms in the U.S. and the devastating tsunami in Japan.

    When it comes to nonprofits in general, donors want to know that the charitable organizations they support are using funds wisely.
    Typically, nonprofits are evaluated by donors in terms of efficiency, an area where AmeriCares has an excellent reputation: 98 percent of expenses go directly to supporting programs and less than 2 percent represent administrative and fundraising costs. Donors, however, should look at more than simple efficiency, says Peter York, senior partner and chief research and learning officer at TCC Group, a nonprofit consultancy headquartered in New York, New York. They should also look at whether organizations have the systems in place to sustain their missions and continue to thrive.

    An expert on nonprofit organizational management, York has spent years studying sustainable charitable organizations. He defines them as nonprofits that are able to achieve the ongoing financial support to stay relevant and continue doing core mission work. In his analysis of well over 2,500 larger nonprofits, York has found that many are not sustaining, and are actually scaling back in size. “One of the biggest challenges of nonprofit sustainability is the general public’s perception that every dollar donated has to go only to the delivery of service,” says York. “What our data shows is that there are some fundamental capacities that have to be there in order for organizations to sustain and grow.”

    York’s research highlights the importance of data-driven leadership at successful nonprofits. “You’ve got to have the tools, the systems, and the technologies to get objective information on what you do, the people you serve, and the results you’re achieving,” says York. “If leaders don’t have the knowledge and the data, they can’t make the strategic decisions about programs to take organizations to the next level.”

    Historically, AmeriCares associates have used time-tested and cost-effective strategies to ship and then track supplies from donation to delivery to their destinations in designated time frames. When disaster strikes, AmeriCares ships by air and generally pulls out all the stops to deliver the most urgently needed aid within the first few days and weeks. Then, as situations stabilize, AmeriCares turns to delivering sea containers for the postemergency and ongoing aid so often needed over the long term.

    According to McDermott, getting a shipment out the door is fairly complicated, requiring as many as five different AmeriCares teams collaborating. The entire process can take months—from receiving products in the warehouse and deciding which recipients to allocate supplies to, to getting customs and governmental approvals in place, actually shipping products, and finally ensuring that the products are received in-country.

    Delivering that aid is no small affair. “Our volume exceeds half a billion dollars a year worth of donated medicines and medical supplies, so it’s a sizable logistical operation to bring these products in and get them out to the right place quickly to have the most impact,” says Sears. “We really pride ourselves on our controls and efficiencies.”

    Adding to that complexity is the fact that the longer it takes to deliver aid, the more dire the human need can be. Any time AmeriCares associates can shave off the complicated aid delivery process can translate into lives saved. “It’s really being able to track information consistently that will help us to see where are the bottlenecks and where can we work on improving our processes,” says McDermott.
    Setting a Standard

    Productivity and information management improvements were key objectives for AmeriCares when staff began the process of implementing Oracle’s Primavera solution. But before configuring the software, the staff needed to take the time to analyze the systems already in place. According to Greg Loop, manager of database systems at AmeriCares, the organization received guidance from several consultants, including Rich D’Addario, consulting project manager in the Primavera Global Business Unit at Oracle, who was instrumental in shepherding the critical requirements-gathering phase. D’Addario encouraged staff to begin documenting shipping processes by considering the order in which activities occur and which ones are dependent on others to get accomplished. This exercise helped everyone realize that to be more efficient, they needed to keep track of shipments in a more standard way. “The staff didn’t recognize formal project management methodology,” says D’Addario. “But they did understand what the most important things are and that if they go wrong, an entire project can go off course.”

    Before, if a boatload of supplies was being sent to Haiti and there was a problem somewhere, a lot of time was taken up finding out where the problem was—because staff was not tracking things in a standard way. As a result, even more time was needed to find possible solutions to the problem and alert recipients that the aid might be delayed. “For everyone to put on the project manager hat and standardize the way every single thing is done means that now the whole organization is on the same page as to what needs to occur from the time a hurricane hits Haiti and when a boat pulls in to unload supplies,” says D’Addario.

    With so much care taken to put a process foundation firmly in place, configuring the Primavera solution was actually quite simple. Specific templates were set up for different types of shipments, and dashboards were implemented to provide executives with clear overviews of every project in the system. AmeriCares’ Loop reports that system planning, refining, and testing, followed by writing up documentation and training, took approximately four months. The system went live in spring 2011 at AmeriCares’ Connecticut headquarters. While the nonprofit has an international presence, with warehouses in Europe and offices in Haiti, India, Japan, and Sri Lanka, most donated medicines come from U.S. entities and are shipped from the U.S. out to the rest of the world. In addition, all shipments are tracked from the U.S. office.

    AmeriCares doesn’t expect the Primavera system to take months off the shipping time, especially for sea containers. However, any time saved is still important because it will allow aid to be delivered to people more quickly at a lower overall cost. “If we can trim a day or two here or there, that can translate into lives that we’re saving, especially in emergency situations,” says Sears.

    A Cultural Change

    Beyond the measurable benefits that come with IT-driven process improvement, AmeriCares management is seeing a change in culture as a result of the Primavera project. One change has been treating every shipment of aid as a project, and everyone involved with facilitating shipments as a project manager. “This is a revolutionary concept for us,” says McDermott.
    “Before, we were used to thinking we were doing logistics—getting a container from point A to point B without looking at it as one project and really understanding what it meant to manage it.”

    AmeriCares staff is also happy to report that collaboration within the organization is much more efficient. When someone creates a shipment in the Primavera system, the same shared template is used, which means anyone can log in to the system to see the status of a shipment. Knowledgeable staff can access a shipment project to help troubleshoot a problem. Management can easily check the status of projects across the organization. “Dashboards are really useful,” says McDermott. “Instead of going into the details of each project, you can just see the high-level real-time information at a glance.”

    The new system is helping team members focus on proactively managing shipments rather than simply reacting when problems occur. For example, when a container is shipped, documents must be included for customs clearance. Now, the shipping template has built-in reminders to prompt team members to ask for copies of these documents from freight forwarders and to follow up with partners to discover if a shipment is on time. In the past, staff may not have worked on securing these documents until they’d been notified a shipment had arrived in-country.

    Another benefit of capturing and adopting best practices within the Primavera system is that staff training is easier. “Capturing the processes in documented steps and milestones allows us to teach new staff members how to do their jobs faster,” says Sears. “It provides them with the knowledge of their predecessors so they don’t have to keep reinventing the wheel.”

    With the Primavera system already generating positive results, management is eager to take advantage of advanced capabilities. Loop is working on integrating the company’s proprietary inventory management system with the Primavera system so that when logistics or warehousing operators input data, the information will automatically go into the Primavera system. In the past, this information had to be manually keyed into spreadsheets, often leading to errors.

    Mining Historical Data

    Another feature on the horizon for AmeriCares is utilizing Primavera P6 Professional Project Management reporting capabilities. As the system begins to include more historical data, management soon will be able to draw on this information to conduct analysis that has not been possible before and create customized reports. For example, at the beginning of the shipment process, staff will be able to use historical data to more accurately estimate how long the approval process should take for a particular country. This could help ensure that food and medicine with limited shelf lives do not get stuck in customs or used beyond their expiration dates.

    The historical data in the Primavera system will also help AmeriCares with better planning year to year. The nonprofit’s staff has always put together a plan at the beginning of the year, but this has been very challenging simply because it is impossible to predict disasters. Now, management will be able to look at historical data and see trends and statistics as they set current objectives and prepare for future need. In addition, this historical data will provide AmeriCares management with the ability to review year-end data and compare actual project results with goals set at the beginning of the year—to see if desired outcomes were achieved and if there are areas that need improvement.
    It’s this type of information that is so valuable to donors. And, according to York, project management software can play a critical role in generating the data to help nonprofits sustain and grow. “It is important to invest in systems to help replicate, expand, and deliver services,” says York. “Project management software can help because it encourages nonprofits to examine program or service changes and how to manage moving forward.”

    Sears believes that AmeriCares donors will support the return on investment the organization will achieve with the Primavera solution. “It won’t be financial returns, but rather how many more people we can help for a given dollar or how much more quickly we can respond to a need,” says Sears. “I think donors are receptive to such arguments.”

    And for AmeriCares, it is all about the future and increasing results. The project management environment currently may be quite simple, but IT staff plans to expand the complexity and functionality as the organization grows in its knowledge of project management and the goals it wants to achieve. “As we use the system over time, we’ll continue to refine our best practices and accumulate more data,” says Sears. “It will advance our ability to make better data-driven decisions.”

    Read the article

  • How to work with NServiceBus in gateway mode

    - by Mike737
    I've been trying to get the pubsub sample in the NServiceBus download to work in gateway mode. I haven't really been able to find much detail at all about how to get NServiceBus to run in gateway mode. How do I set up the publisher/server in gateway mode? When I did try, I received an access denied exception, which is either due to the account I'm running it under or something I'm missing. How do I set up the subscribers/clients to communicate with the gateway? Can anyone point me in the right direction?

    Read the article

  • Will ExtJS die?

    - by Stefan Kendall
    I look at ExtJS, and it appears to provide many of the RIA features that bulkier suites such as Flex provide, without the Flash requirement. However, as open-source initiatives such as jQuery UI continue, will ExtJS simply die at some point? Furthermore, since Flash penetration only continues to increase, why put stock in a JavaScript library? That said, JavaScript libraries such as jQuery have made gigantic leaps in providing easy-to-use APIs with great functionality, so maybe there's some merit in that. Thoughts? Opinions? ExtJS has a price tag, so I have to ask this question.

    Read the article

  • Java object caching, which is faster, reading from a file or from a remote machine?

    - by Kumar225
    I am at a point where I need to decide what to do when caching of objects reaches the configured threshold. Should I store the objects in an indexed file (like that provided by JCS) and read them from the file (file I/O) when required, or store the objects in a distributed cache (network, serialization, deserialization)? We are using Solaris as the OS.

    EDIT: Adding some more information. I am asking this question to determine whether I can switch to distributed caching. The remote server that will hold the cache will have more memory and a better disk, and will be used only for caching. One reason we cannot increase the number of locally cached objects is that they are stored in the JVM heap, which has limited memory (we are using a 32-bit JVM).

    UPDATE: Thanks - we finally ended up choosing Coherence as our cache product. It provides many cache configuration topologies: in-process vs. remote vs. disk, etc.

    Read the article

  • How does one inject variables into page templates from a custom Drupal module?

    - by Michael T. Smith
    We've created a custom module for organizing and publishing our newsletter content. The issue I'm running into now -- and I'm new to theming and Drupal module development, so it could just be a knowledge issue as opposed to a Drupal issue -- is how to get each newsletter themed. At this point the URL structure of our newsletter will be: /newsletters/{newsletter-name}/{edition-name}/{issue-date} which means that we can create template files in our theme using filenames like page-newsletters-{newsletter-name}-{edition-name}.tpl.php, which is great. The one issue I'm running into is that all of the content comes through in the $content variable of the theme. I'd like to have it come through as different variables (so that I can, inside the theme, place certain content in certain areas). Is there a proper way of doing this?
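
    The usual Drupal mechanism for this is a preprocess hook: a module (or the theme's template.php) can implement hook_preprocess_page() and add entries to the variables array, and each entry then becomes its own variable in page templates. A minimal sketch, assuming a module named newsletter and hypothetical variable names:

    ```php
    <?php
    // In newsletter.module (Drupal 6/7 style). Each key added to $variables
    // becomes its own variable in page*.tpl.php templates.
    function newsletter_preprocess_page(&$variables) {
      // Placeholder values; a real module would load these from its own data.
      $variables['newsletter_title']   = t('June issue');
      $variables['newsletter_body']    = $variables['content'];  // reuse or split
      $variables['newsletter_sidebar'] = t('Subscribe links here');
    }
    ```

    Inside page-newsletters-{newsletter-name}-{edition-name}.tpl.php, the pieces can then be placed individually, e.g. <?php print $newsletter_title; ?>.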

    Read the article

  • How to JBoss/Blazeds clustering and channel failover

    - by Francesco
    Hi, I'm stuck with JBoss and BlazeDS clustering. What I have now is:

      • 2 JBoss instances, running in "all" mode
      • One load balancer with Apache and mod_jk, as suggested by the JBoss docs
      • A Spring/Flex integration app
      • A Flex application that I do not want to throw errors when one of my JBoss instances goes down

    I find the Adobe documentation really lacking, and being new to clustering, JGroups, and balancing, I cannot figure out how to deploy my app in a clustered environment. From what I understand, if configured correctly BlazeDS should tell the Flex client about failover servers upon connection; then, if the Flex client can't connect to the main server, it goes to another. But the hard part for me is getting there. Can someone point me in the right direction? Thanks in advance

    Read the article

  • Convert BLOB to Jpg using C#

    - by TGuimond
    Hello, I am currently developing a web application that receives data from an on-site Oracle database. The database developers have developed some web services that I am able to call and send/receive data to and from the database. When I want to display an image, the method returns the image in BLOB format. My question is: what is the best way to convert a BLOB to .jpg or .bmp so I can display the image correctly? If someone could point me in the right direction that would be great! Cheers, Tristan
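
    If the web service hands the BLOB over as a byte array, System.Drawing can decode it and re-encode it as JPEG. A minimal sketch, assuming the bytes contain an image format GDI+ understands:

    ```csharp
    using System.Drawing;
    using System.Drawing.Imaging;
    using System.IO;

    public static class BlobConverter
    {
        // Decode a BLOB (raw image bytes) and save it as a .jpg file.
        public static void SaveBlobAsJpeg(byte[] blob, string outputPath)
        {
            using (var stream = new MemoryStream(blob))
            using (var image = Image.FromStream(stream))
            {
                image.Save(outputPath, ImageFormat.Jpeg);
            }
        }
    }
    ```

    For display in a web app, the same MemoryStream could instead be written to the response with an image/jpeg content type, skipping the file entirely.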

    Read the article

  • ASP.NET CustomValidator trying to match to System.EventHandler?

    - by annakata
    I have markup like so:

        <asp:TextBox runat="server" ID="Accountname" />
        <asp:CustomValidator runat="server" ControlToValidate="Accountname"
            OnServerValidate="Accountname_CheckUnique"
            meta:resourcekey="ACCOUNTNAME_UNAVAILABLE" />

    and code-behind like so:

        protected void Accountname_CheckUnique(object source, ServerValidateEventArgs arguments)
        {
            arguments.IsValid = Foo();
        }

    This was working just fine, and then without changing anything on the page, ASP.NET now insists:

        No overload for 'Accountname_CheckUnique' matches delegate 'System.EventHandler'

    Well no, and nor should it according to MSDN. It's late and I'm tired; does anybody know how to fix this, or can you point out the glaring flaw in my comprehension?
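
    The ServerValidate event is typed as ServerValidateEventHandler, which the method above already matches, so one hedged workaround while hunting the real cause is to drop OnServerValidate from the markup and wire the handler explicitly, removing any ambiguity about which delegate the runtime binds. A sketch, assuming the validator is given an ID (an addition to the original markup):

    ```csharp
    // Markup (OnServerValidate attribute removed, ID added - an assumption):
    // <asp:CustomValidator runat="server" ID="AccountnameValidator"
    //     ControlToValidate="Accountname" meta:resourcekey="ACCOUNTNAME_UNAVAILABLE" />

    protected void Page_Init(object sender, EventArgs e)
    {
        // Explicit wiring to the correctly typed delegate.
        AccountnameValidator.ServerValidate +=
            new ServerValidateEventHandler(Accountname_CheckUnique);
    }
    ```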

    Read the article

  • WCF - WebReferences not working

    - by JMSA
    At the client end, I have generated a proxy using SvcUtil.exe and it is working fine. Then I added a Web Reference to the client assembly and called the same method, but it is not working. My program is running in console mode and the method is supposed to return a string. It is not returning the string; I just see a blank console window. No exception is thrown, and after setting a breakpoint on the method call I see that the program halts on the method call forever. What should I look for to solve the problem? I am using VS2005, and I am adding the Web Reference by right-clicking the client project and then clicking the "Add Web Reference" pop-up menu item.

    Read the article

  • How to create a container that holds different types of function pointers in C++?

    - by Alex
    I'm doing a linear genetic programming project, where programs are bred and evolved by means of natural evolution mechanisms. Their "DNA" is basically a container (I've used arrays and vectors successfully) which contains function pointers to a set of available functions. Now, for simple problems, such as mathematical problems, I could use one type-defined function pointer which could point to functions that all return a double and all take as parameters two doubles. Unfortunately this is not very practical. I need to be able to have a container which can hold different sorts of function pointers, say a function pointer to a function which takes no arguments, or a function which takes one argument, or a function which returns something, etc. (you get the idea)... Is there any way to do this using any kind of container? Could I do that using a container which contains polymorphic classes, which in turn have various kinds of function pointers? I hope someone can direct me towards a solution, because redesigning everything I've done so far is going to be painful.
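
    The polymorphic-wrapper idea in the question works: define an abstract "gene" interface and wrap each function-pointer arity in its own subclass, so a single container of base-class pointers can hold them all. A minimal sketch of that approach (the Gene and *Gene names are invented for illustration; a stack-based evaluation context is one common choice for linear GP):

    ```cpp
    #include <memory>
    #include <vector>

    // Abstract base: every gene can be executed against an evaluation stack.
    struct Gene {
        virtual ~Gene() {}
        virtual void execute(std::vector<double>& stack) const = 0;
    };

    // Wraps double(*)(double, double): pops two operands, pushes the result.
    struct DoubleBinaryGene : Gene {
        double (*fn)(double, double);
        explicit DoubleBinaryGene(double (*f)(double, double)) : fn(f) {}
        void execute(std::vector<double>& stack) const {
            double b = stack.back(); stack.pop_back();
            double a = stack.back(); stack.pop_back();
            stack.push_back(fn(a, b));
        }
    };

    // Wraps double(*)(): a nullary function, e.g. a constant or sensor read.
    struct DoubleNullaryGene : Gene {
        double (*fn)();
        explicit DoubleNullaryGene(double (*f)()) : fn(f) {}
        void execute(std::vector<double>& stack) const {
            stack.push_back(fn());
        }
    };

    // The "DNA": one container holding genes of different underlying signatures.
    typedef std::vector<std::shared_ptr<Gene> > Dna;

    double add(double a, double b) { return a + b; }
    double one() { return 1.0; }

    // Usage:
    //   Dna dna;
    //   dna.push_back(std::make_shared<DoubleBinaryGene>(&add));
    //   dna.push_back(std::make_shared<DoubleNullaryGene>(&one));
    ```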

    Read the article

  • Getting website data into Adobe InDesign

    - by Magnus Smith
    I'd like our magazine team to be able to download website data in a file that Adobe InDesign can read. They can then import/open the file, make a few tweaks, and cut out a vast deal of repetitive manual labour (they currently use copy&paste for a few hours). After a brief Google I note that v2 of InDesign can import/export XML so perhaps that is my best bet? Are there any alternatives, and can anyone offer any advice on them? I am using a PC, and the magazine team are on Macs; testing will be tiresome I fear. The data we wish to format is fairly simple - a title followed by a short chunk of text (repeated about 50 times, say). I'll ask about importing images later. Thanks for your help. I will return to Google now, but it would be great if anyone can point me in a more specific direction first!
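
    XML import does fit the data described: InDesign can map XML tags to paragraph styles on import. A sketch of the kind of export file this would mean generating on the website side (the element names are invented for illustration):

    ```xml
    <?xml version="1.0" encoding="UTF-8"?>
    <stories>
      <item>
        <title>First article title</title>
        <body>A short chunk of text for the first article.</body>
      </item>
      <item>
        <title>Second article title</title>
        <body>A short chunk of text for the second article.</body>
      </item>
      <!-- ...repeated for each of the ~50 entries... -->
    </stories>
    ```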

    Read the article

  • Custom component which displays voice recognition button if available

    - by steff
    Hi everyone, I'd like to create a custom component which supports voice recognition. It will primarily be an extended EditText which should show the microphone button for voice recognition if it is available. I wanted to look at the search app-widget on the homescreen, but I can't find it in the source. This is intended to use voice recognition as a sort of dictation device, i.e. the user does not have to type but can use his voice instead. So could anyone please point me in some direction? Thanks in advance, Steff
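
    The availability check can be done by asking the PackageManager whether any activity handles RecognizerIntent.ACTION_RECOGNIZE_SPEECH, and only showing the microphone button if one does. A minimal sketch of that check (micButton is a hypothetical field of the custom component):

    ```java
    import android.content.Intent;
    import android.content.pm.PackageManager;
    import android.content.pm.ResolveInfo;
    import android.speech.RecognizerIntent;
    import android.view.View;
    import android.widget.ImageButton;

    import java.util.List;

    // Inside the custom component (e.g. a wrapper around an EditText);
    // micButton is the hypothetical microphone button of the component.
    private ImageButton micButton;

    private void showMicButtonIfAvailable() {
        PackageManager pm = getContext().getPackageManager();
        List<ResolveInfo> recognizers = pm.queryIntentActivities(
                new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH), 0);
        // Show the microphone only when a speech recognizer is installed.
        micButton.setVisibility(recognizers.isEmpty() ? View.GONE : View.VISIBLE);
    }
    ```

    Tapping the button would then fire the intent with startActivityForResult(), and the recognized strings come back in the RecognizerIntent.EXTRA_RESULTS extra.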

    Read the article

  • OOP Design Question - Where/When do you Validate properties?

    - by JW
    I have read a few books on OOP and DDD/PoEAA/Gang of Four, and none of them seem to cover the topic of validation - it always seems to be assumed that data is valid. I gather from the answers to this post (http://stackoverflow.com/questions/1651964/oop-design-question-validating-properties) that a client should only attempt to set a valid property value on a domain object. This person has asked a similar question that remains unanswered: http://bytes.com/topic/php/answers/789086-php-oop-setters-getters-data-validation#post3136182 So how do you ensure it is valid? Do you have a 'validator method' alongside every getter and setter? isValidName() setName() getName() I seem to be missing some key basic knowledge about OOP data validation - can you point me to a book that covers this topic in detail? I.e. covering different types of validation / invariants / handling feedback / whether or not to use exceptions, etc.
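
    One common arrangement matching the isValidName()/setName()/getName() trio above is to expose the validation rule as its own method and have the setter call it, so clients can pre-check and the object still protects its invariant. A minimal sketch (the class and rule are invented for illustration):

    ```java
    public class Account {
        private String name;

        // The rule is public so clients can validate before calling the setter.
        public static boolean isValidName(String name) {
            return name != null && !name.trim().isEmpty() && name.length() <= 50;
        }

        // The setter enforces the same rule, so the invariant holds whether or
        // not the client checked first.
        public void setName(String name) {
            if (!isValidName(name)) {
                throw new IllegalArgumentException("Invalid account name: " + name);
            }
            this.name = name;
        }

        public String getName() {
            return name;
        }
    }
    ```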

    Read the article

  • jquery-ui from a bookmarklet

    - by Mala
    Hi, I'm trying to create a bookmarklet which will use jquery-ui - what steps must I take to do this? My bookmarklet loads a remote script and appends it to the page. This script includes jQuery and jquery-ui, but getting the stylesheets and images is turning into a nightmare. Do I have to edit all the CSS to point images to their locations on the remote server? Effectively what I'm trying to do is have a bookmarklet which will pop up some information in a nice pretty draggable jquery-ui dialog. Is there an easier way? Thanks, Mala
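
    A bookmarklet can inject both the stylesheet and the scripts by appending link and script elements that point at absolute URLs. That also sidesteps the image problem: url(...) paths inside a stylesheet resolve relative to the stylesheet's own URL, so no CSS editing is needed as long as the CSS and its images live together on the remote host. A minimal sketch (the CDN URLs are illustrative; any absolute host works):

    ```javascript
    (function () {
      // Stylesheet: image url(...)s inside it resolve relative to this href,
      // so the theme images load from the remote host automatically.
      var css = document.createElement('link');
      css.rel = 'stylesheet';
      css.href = 'https://code.jquery.com/ui/1.13.2/themes/base/jquery-ui.css';
      document.head.appendChild(css);

      // Load jQuery first, then jQuery UI, then open the dialog.
      function load(src, onload) {
        var s = document.createElement('script');
        s.src = src;
        s.onload = onload;
        document.head.appendChild(s);
      }

      load('https://code.jquery.com/jquery-3.6.4.min.js', function () {
        load('https://code.jquery.com/ui/1.13.2/jquery-ui.min.js', function () {
          jQuery('<div title="Info">Hello from the bookmarklet</div>')
            .appendTo('body')
            .dialog({ draggable: true });
        });
      });
    })();
    ```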

    Read the article

  • Difference between "traditional" COM and COM+ (in Component Services)

    - by kizzx2
    By the "traditional" way I mean registering the DLL in registry. There seems to be another method to set up it by going to mmc-Component Services-COM+ Applications and adding the .tlb file. I have a COM library that supports both methods. When it installs, it registers itself in the registry as a COM component and it works fine. However, when I added the .tlb file using the Component Services method, the behavior seems to be different and it starts giving out errors. I suspect it has something to do with marshaling and inter-process object transfer? (Sorry, I'm really a noob in the COM area) Can anyone point me to a good resource to clear my understanding?

    Read the article

  • iPad split controller that doesn't hide the left pane in portrait

    - by Tim Norman
    I am trying to implement a split view controller like UISplitViewController on the iPad, but I don't want the left pane to be hidden when the device is in portrait orientation. So I've created a UIViewController subclass for this in IB and it works fine without any sub-view controllers. Now I'm trying to wrap my head around what is required to set up and manage the two UIViewController objects for the left and right panes. In my app, they are both going to be UINavigationControllers with a UITableView in them. I've hit a mental roadblock about how to set this up and was hoping someone could point me to some sample code or give me a recommendation for the architecture here...
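
    Before the view-controller containment API, the usual pattern was for the parent controller to create the two navigation controllers itself and lay their views out side by side, never hiding the left one regardless of orientation. A rough Objective-C sketch of that idea (the ivar names, the 320-point width, and the use of viewWillLayoutSubviews are all illustrative assumptions):

    ```objc
    // In the custom split controller's implementation; leftNav, rightNav,
    // leftTable, and rightTable are hypothetical ivars.
    - (void)viewDidLoad {
        [super viewDidLoad];
        leftNav  = [[UINavigationController alloc] initWithRootViewController:leftTable];
        rightNav = [[UINavigationController alloc] initWithRootViewController:rightTable];
        [self.view addSubview:leftNav.view];
        [self.view addSubview:rightNav.view];
    }

    - (void)viewWillLayoutSubviews {
        // Keep the left pane visible in both orientations: a fixed-width
        // left column, with the right pane taking the remainder.
        CGRect bounds = self.view.bounds;
        CGFloat leftWidth = 320.0;
        leftNav.view.frame  = CGRectMake(0, 0, leftWidth, bounds.size.height);
        rightNav.view.frame = CGRectMake(leftWidth, 0,
                                         bounds.size.width - leftWidth,
                                         bounds.size.height);
    }
    ```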

    Read the article

  • Parsing JPEG file format: Format of entropy-coded segment (ECS) segments?

    - by me2
    I'm having difficulty understanding the ITU-T T.81 spec for the JPEG file format. Hopefully someone else here has tried to parse JPEG files and/or knows about the details of this file format. The spec indicates that the ECS0 segment starts after the SOS segment, but I can't find where in the spec it actually talks about the format of the ECS0 segment or how to detect its start. Simple JPEG implementations online are of limited help because they assume many things about the JPEGs they parse. Can anyone point me in the right direction?
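
    The spec never gives ECS its own marker: the entropy-coded data simply begins right after the SOS header and runs until the next real marker. Inside it, any 0xFF byte is followed by a stuffed 0x00 (and restart markers 0xD0 through 0xD7 may appear), so the end is found by scanning for 0xFF followed by anything else. A minimal sketch in C under that reading of the spec:

    ```c
    #include <stddef.h>

    /* Scan the entropy-coded segment that begins at data[0] (the byte right
     * after the SOS header). Returns the offset of the 0xFF that starts the
     * next real marker, or len if none is found. */
    size_t find_ecs_end(const unsigned char *data, size_t len)
    {
        for (size_t i = 0; i + 1 < len; i++) {
            if (data[i] != 0xFF)
                continue;
            unsigned char next = data[i + 1];
            if (next == 0x00)
                continue;                      /* stuffed byte: literal 0xFF */
            if (next >= 0xD0 && next <= 0xD7)
                continue;                      /* RSTn marker: still inside ECS */
            return i;                          /* a real marker: ECS ends here */
        }
        return len;
    }
    ```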

    Read the article

  • Silverlight MEF Embedded Resources

    - by Fastidious
    I have two different Silverlight UserControls imported with MEF from two different xaps. The UserControls are simply an Image on a Canvas. Both UserControls have the image marked as 'Resource'. The images are different but their names are the same (key point). I'm not quite sure what's going on behind the scenes of the MEF import but both images seem to end up in the same AppDomain. After the composition when I stick the UserControls on a Canvas, each is an instance of the class it should be, but they both show the same image. Obviously if the image file names are unique across all xaps I import I have no problem but I don't like that solution. Is there a better one?
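
    One way to keep same-named resources from colliding is to make every reference assembly-qualified with the component pack-URI syntax, so each XAP resolves the image against its own assembly rather than a bare relative path. A sketch (the assembly and file names are illustrative):

    ```xml
    <!-- Inside the UserControl of each plugin XAP: qualify the resource
         with the owning assembly instead of using a bare relative path. -->
    <Canvas>
        <Image Source="/PluginA;component/Images/icon.png" />
    </Canvas>
    ```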

    Read the article

  • SQL Table stored as a Heap - the dangers within

    - by MikeD
    Nearly all of the time I create a table, I include a primary key, and often that PK is implemented as a clustered index. Those two don't always have to go together, but in my world they almost always do. On a recent project, I was working on a data warehouse and a set of SSIS packages to import data from an OLTP database into my data warehouse. The data I was importing from the business database into the warehouse was mostly new rows, sometimes updates to existing rows, and sometimes deletes. I decided to use the MERGE statement to implement the insert, update, or delete in the data warehouse. I found it quite performant to have a stored procedure that extracted all the new, updated, and deleted rows from the source database and dumped them into a working table in my data warehouse, then run a stored proc in the warehouse that was the MERGE statement, taking the rows from the working table and updating the real fact table.

    ```sql
    USE Warehouse

    CREATE TABLE Integration.MergePolicy (
        PolicyId int,
        PolicyTypeKey int,
        Premium money,
        Deductible money,
        EffectiveDate date,
        Operation varchar(5)
    )

    CREATE TABLE fact.Policy (
        PolicyKey int identity primary key,
        PolicyId int,
        PolicyTypeKey int,
        Premium money,
        Deductible money,
        EffectiveDate date
    )

    -- Note: the procedure is named MergePolicyLoad here; a procedure cannot
    -- share the name Integration.MergePolicy with the worktable above.
    CREATE PROC Integration.MergePolicyLoad
    AS
    BEGIN
        BEGIN TRAN

        MERGE fact.Policy AS tgt
        USING Integration.MergePolicy AS src
           ON (tgt.PolicyId = src.PolicyId)
        WHEN NOT MATCHED BY TARGET THEN
            INSERT (PolicyId, PolicyTypeKey, Premium, Deductible, EffectiveDate)
            VALUES (src.PolicyId, src.PolicyTypeKey, src.Premium, src.Deductible, src.EffectiveDate)
        WHEN MATCHED AND src.Operation = 'U' THEN
            UPDATE SET PolicyTypeKey = src.PolicyTypeKey,
                       Premium = src.Premium,
                       Deductible = src.Deductible,
                       EffectiveDate = src.EffectiveDate
        WHEN MATCHED AND src.Operation = 'D' THEN
            DELETE
        ;

        DELETE FROM Integration.MergePolicy

        COMMIT
    END
    ```

    Notice that my worktable (Integration.MergePolicy) doesn't have any primary key or clustered index. I didn't think this would be a problem, since it was a relatively small table and was empty after each time I ran the stored proc.

    For one of the work tables, during the initial loads of the warehouse, it was getting about 1.5 million rows inserted, processed, then deleted. Also, because of a bug in the extraction process, the same 1.5 million rows (plus a few hundred more each time) were getting inserted, processed, and deleted. This was being done on a fairly hefty server that was otherwise unused, and no one was paying any attention to the time it was taking.

    This week I received a backup of this database and loaded it on my laptop to troubleshoot the problem, and of course it took a good ten minutes or more to run the process. However, what seemed strange to me was that after I fixed the problem and happened to run the merge sproc when the work table was completely empty, it still took almost ten minutes to complete. I immediately looked back at the MERGE statement to see if I had some sort of outer join that meant it would be scanning the target table (which had about 2 million rows in it), then turned on the execution plan output to see what was happening under the hood. Running the stored procedure again took a long time, and the plan output didn't show me much - 55% on the MERGE statement and 45% on the DELETE statement, and table scans on the work table in both places. I was surprised at the relative cost of the DELETE statement, because there were really 0 rows to delete, but I was expecting to see the table scans.
    (I was beginning now to suspect that my problem was because the work table was being stored as a heap.) Then I turned on STATS_IO and ran the sproc again. The output was quite interesting.

        Table 'Worktable'. Scan count 0, logical reads 0, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
        Table 'Policy'. Scan count 0, logical reads 0, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
        Table 'MergePolicy'. Scan count 1, logical reads 433276, physical reads 60, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.

    I've reproduced the above from memory, so the details aren't exact, but the essential bit was the very high number of logical reads on the table stored as a heap. Even just doing a SELECT Count(*) from Integration.MergePolicy incurred that sort of output, even though the result was always 0.

    I suppose I should research more on the allocation and deallocation of pages to tables stored as a heap, but I haven't, and my original assumption that a table stored as a heap with no rows would only need to read one page to answer any query was definitely proven wrong. It's likely that some sort of physical defragmentation of the table may have cleaned that up, but it seemed that the easiest answer was to put a clustered index on the table. After doing so, the execution plan showed a clustered index scan, and the IO stats showed only a single page read. (I aborted my first attempt at adding a clustered index on the table because it was taking too long - instead I ran TRUNCATE TABLE Integration.MergePolicy first and then added the clustered index, both of which took very little time.)

    I suspect I may not have noticed this if I had used TRUNCATE TABLE Integration.MergePolicy instead of DELETE FROM Integration.MergePolicy, since I'm guessing that the truncate operation does some rather quick releasing of pages allocated to the heap table. In the future, I will likely be much more careful to have a clustered index on every table I use, even the working tables.

    Mike
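
    A sketch of the eventual fix as SQL (the index name and key column are assumptions; the article doesn't give them, but any key that suits the MERGE join would do):

    ```sql
    -- Empty the worktable first (fast page deallocation), then add the index.
    TRUNCATE TABLE Integration.MergePolicy;

    CREATE CLUSTERED INDEX IX_MergePolicy_PolicyId
        ON Integration.MergePolicy (PolicyId);
    ```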

    Read the article

  • Installing unsigned x64 driver to work with libusbdotnet

    - by user216194
    Hi all - I am currently in a Windows 7 dev environment working to get a device to initialize with libusbdotnet. The device (a USB mass storage device) connects and runs using the default USB mass-storage driver for Windows. I want to replace this driver with the one created by the .INF wizard in libusbdotnet. The operating system is 64-bit, and by default the INF wizard produces this driver, but when I go to "Pick from a list of drivers" and point to the directory where the newly created device drivers are, I am unable to select it, I believe because it is unsigned. I have enabled "TEST MODE" using DSEO but I'm still unable to select this file. Anyone familiar with libusbdotnet, or with directing devices to work with a specific unsigned driver in Windows (do I need the .inf file? or the .sys???), do you have any advice about where I'm going wrong? Thanks!
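
    For what it's worth, test mode on 64-bit Windows 7 is normally enabled with bcdedit from an elevated command prompt, followed by a reboot; unsigned drivers also generally need the .sys (and its catalog) test-signed with a certificate the machine trusts. The boot-configuration part:

    ```
    bcdedit /set testsigning on
    shutdown /r /t 0
    ```

    After the reboot, the desktop shows a "Test Mode" watermark when the setting has taken effect.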

    Read the article

  • Is there a HIPAA certification that would be helpful to programmers?

    - by Aequitarum Custos
    I work at a medical billing company whose software handles many facets of a center's practice, including patients, treatments, etc. To be frank, the only thing I know about HIPAA is that security is paramount. I was wondering if there were any training courses with an earned certification proving knowledge/experience that I could use to increase my worth to the company and give them a selling point (we have HIPAA-certified employees). I did a Google search and found this site, but it does not appear legitimate, and I'm trying to avoid any of those fake certification/degree sites. Does anyone know of any recognized and legitimate (preferably government-sponsored) HIPAA certifications useful to programmers? If state is a factor, I live in Oklahoma.

    Read the article
