Search Results

Search found 67705 results on 2709 pages for 'time management'.


  • Can I delay selection of a file name?

    - by Xavierjazz
    XP SP3. I have my system set up so that I do not need to double-click on an item to open it. However, I find that items are selected when I move over them swiftly. This is a problem when I want to save a file and inadvertently pass over another file name: it immediately gets selected. Is there a way to delay this selection for, say, a second, so that items are not selected so quickly? Thanks. Regards,

    Read the article

  • Unix: Search for file contents

    - by Svish
    I find the find . -name "some-file" command very useful for listing all files matching some file name in a folder. Is there anything similar I can use to list all files that contain a given string? If you needed to find all files in a directory that had a certain string of text in them, what would you use?
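
    A minimal sketch of the usual approach (GNU grep assumed; the search string and path are placeholders):

        # list every file under the current directory whose contents match "string"
        grep -rl "string" .

        # the same via find, handy when you also need find's own filters
        find . -type f -exec grep -l "string" {} +

    Both commands print only the matching file names; drop -l to see the matching lines as well.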

    Read the article

  • File permissions in Windows XP

    - by user23950
    Is there any software that can be installed on Windows XP to set file permissions for guest accounts, so that they would have to enter a valid administrator password before they can access a file? I've seen a feature like this in Ubuntu, where even the administrator has to enter the password over and over just to access a certain drive. But I need it in Windows XP.

    Read the article

  • Find copies of folders? (Not files)

    - by acidzombie24
    I have a dozen folders that are duplicates. Within them are a few dozen folders that are duplicates, so I have a few thousand copies of the same files and folders. Many of them are exactly the same while others have changes in a few files. What utility can I use to delete folders that are copies of others with no changes? If one or more files in a folder have been changed I don't want it deleted (and I'd like the subfolders to have shortcuts to a copy, but that's not required). Is there a utility to do this?
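
    Not a pointer to a particular utility, but a minimal sketch of the underlying idea (folder names and the fingerprint routine are placeholders): fingerprint each candidate folder by hashing every contained file's relative path and contents, then group folders whose fingerprints match. Only folders in a group of two or more are byte-for-byte copies; anything with a changed file ends up with a different fingerprint and is left alone.

        import hashlib
        import os
        from collections import defaultdict

        def folder_fingerprint(root):
            """One hash over every file's relative path and contents under root."""
            digest = hashlib.sha256()
            for dirpath, dirnames, filenames in sorted(os.walk(root)):
                for name in sorted(filenames):
                    path = os.path.join(dirpath, name)
                    digest.update(os.path.relpath(path, root).encode())
                    with open(path, "rb") as fh:
                        digest.update(fh.read())
            return digest.hexdigest()

        # group candidate folders by fingerprint; identical digests mean identical trees
        candidates = ["copy_a", "copy_b", "copy_c"]   # placeholder folder names
        groups = defaultdict(list)
        for folder in candidates:
            groups[folder_fingerprint(folder)].append(folder)

        for fingerprint, folders in groups.items():
            if len(folders) > 1:
                print("exact copies:", folders)

    A real tool would also want to skip symlinks and handle unreadable files, but the grouping idea stays the same.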

    Read the article

  • Why are my SharePoint downloads not completing for outside users?

    - by CT
    I am using WSS 3.0 with Microsoft Server 2003 and I am running into the following problem. On a pretty frequent basis, outside users are having trouble downloading documents: some downloads report as complete while the file is still incomplete. For instance, a PDF is a 17 MB file. If I download it from within the office, all 17 MB are downloaded and it opens. If I download it from an outside connection, it may download anywhere from 5-10 MB of the file and then say it is complete. When one of these partial downloads is opened, the user gets the error "this file is corrupt and cannot be repaired". I have solved this problem on some occasions by simply deleting the document and uploading a new copy, but this does not always work. Are there known bugs? Are there Internet settings that need to be modified on the outside user's machine? Does anyone else run into this?

    Read the article

  • A methodology that allows for a single Java code base covering many different versions?

    - by Thorbjørn Ravn Andersen
    I work in a small shop where we have a LOT of legacy Cobol code and where a methodology has been adopted to allow us to minimize forking and branching as much as possible. For a given release we have three levels:

        CORE - bottom layer; this code is common to all releases.
        GROUP - optional code common to several customers.
        CUSTOMER - optional code specific to a single customer.

    When a program is needed, it is first searched for in CUSTOMER, then in GROUP and finally in CORE. A given application for us invokes many programs which are all looked up in this sequence (think exe files and PATH under Windows). We also have Java programs interacting with this legacy code, and as the core-group-customer lookup mechanism does not lend itself easily to Java, the Java side has tended to grow a CVS branch for each customer, requiring much too much maintenance. The Java part and the backend part tend to be developed in parallel.

    I have been assigned to figure out a way to make the two worlds meet. Essentially we want a Java environment which allows us to have a single code base with sources for each release, where we can easily select a group and a customer and work with the application as it goes for that customer, and then easily switch to another codeset and THAT customer. I was thinking of perhaps a scenario with an Eclipse project for each core, group, and customer, and then using Project Sets to select those we need for a given scenario. The problem I cannot get my head around is how we would create robust code in the CORE projects which will work regardless of which group and customer is selected. A Factory class which knows which subclass of a passed Class object to invoke instead of each and every new? Others must have had similar code base management problems. Anybody with experiences to share?

    EDIT: The conclusion to the problem above has been that CVS needs to be replaced with a source code management system better suited for dealing with many branches concurrently and for migrating source from one component to another while keeping history. Inspired by the recent migration by slf4j and logback we are currently looking at git, as it handles branches very well. We've considered Subversion and Mercurial too, but git appears to be better for single-location, multi-branched projects. I've asked about Perforce in another question, but my personal inclination is towards open source solutions for something as crucial as this.

    EDIT: After some more pondering, we've found that our actual pain point is that we use branches in CVS, and that branches in CVS are easiest to work with if you branch ALL files! The revised conclusion is that we can do this with CVS alone, by switching to a forest of Java projects, each corresponding to one of the levels above, and using the Eclipse build paths to tie them together so each CUSTOMER version pulls in the appropriate GROUP and CORE projects. We still want to switch to a better versioning system, but that is so important a decision that we want to delay it as much as possible.

    EDIT: I now have a proof-of-concept implementation of the CORE-GROUP-CUSTOMER concept using Google Guice 2.0 - the @ImplementedBy tag is just what we need (a sketch follows below). I wonder what everybody else does? Using if's all over the place?

    EDIT: Now I also need this functionality for web applications. Guice was the interim answer until JSR-330 is in place. Anybody with versioning experience?

    EDIT: JSR-330/299 is now in place with the JEE6 reference implementation Weld, based on JBoss Seam, and I have reimplemented the proof-of-concept with Weld. If we use @Alternative along with an <alternatives> entry in beans.xml we can get the behaviour we desire, i.e. provide a new implementation for a given functionality in CORE without changing a bit in the CORE jars. Initial reading up on the Servlet 3.0 specification indicates that it may support the same functionality for web application resources (not code). We will now do initial testing on the real application.
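
    For reference, a minimal sketch of the Guice 2.0 arrangement mentioned in the third EDIT above (the interface, class and module names are placeholders, not taken from the real code base): the CORE jar ships a default binding via @ImplementedBy, and a CUSTOMER module overrides it without touching anything in CORE.

        import com.google.inject.AbstractModule;
        import com.google.inject.Guice;
        import com.google.inject.ImplementedBy;
        import com.google.inject.Injector;

        // CORE project: the default implementation is declared next to the interface.
        @ImplementedBy(CorePricing.class)
        interface Pricing {
            double priceFor(String article);
        }

        class CorePricing implements Pricing {
            public double priceFor(String article) { return 100.0; }
        }

        // CUSTOMER project: a different implementation, no change to CORE needed.
        class AcmePricing implements Pricing {
            public double priceFor(String article) { return 80.0; }  // customer-specific rule
        }

        class AcmeModule extends AbstractModule {
            @Override
            protected void configure() {
                bind(Pricing.class).to(AcmePricing.class);
            }
        }

        public class Demo {
            public static void main(String[] args) {
                // CORE behaviour: no module installed, @ImplementedBy supplies the default.
                Injector core = Guice.createInjector();
                // CUSTOMER behaviour: the explicit binding in AcmeModule wins over @ImplementedBy.
                Injector acme = Guice.createInjector(new AcmeModule());
                System.out.println(core.getInstance(Pricing.class).priceFor("X"));
                System.out.println(acme.getInstance(Pricing.class).priceFor("X"));
            }
        }

    Because @ImplementedBy only acts as a fallback, any explicit bind(...) installed for a group or customer wins over the CORE default, which mirrors the CUSTOMER-before-CORE lookup described above for the Cobol side.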

    Read the article

  • Evidence-Based-Scheduling - are estimations only as accurate as the work-plan they're based on?

    - by Assaf Lavie
    I've been using FogBugz's Evidence Based Scheduling (for the uninitiated, Joel explains) for a while now, and there's an inherent problem I can't seem to work around. The system is good at telling me the probability that a given project will be delivered at some date, given the detailed list of tasks that comprise the project. However, it does not take into account the fact that during development additional tasks always pop up. Now, there's the garbage-can approach of creating a generic task/scheduled-item for "last minute hacks" or "integration tasks", or what have you, but that clearly goes against the idea of aggregating the estimates of many small cases.

    It's often the case that during the development stage of a project you realize that there's a whole area your planning didn't cover, because, well, that's the nature of developing stuff that hasn't been developed before. So now your ~3 month project may very well turn into a 6 month project, but not because your estimations were off (you could be the best estimator in the world, for those tasks that comprised your initial work plan); rather because you ended up adding a whole bunch of new tasks that weren't there to begin with. EBS doesn't help you with that. It could, theoretically (I guess). It could, perhaps, measure the amount of work you add to a project over time and take that into consideration when estimating the time remaining on a given project. Just a thought.

    In other words, EBS works on a task basis, but not on a project/release basis - and the latter is what's important. It's what your boss typically cares about - the delivery date, not the time it takes to finish each task along the way, and not the time it would have taken if your planning were perfect. So the question is (yes, there's a question here, don't close it): What's your methodology when it comes to using EBS in FogBugz, and how do you solve the problem above, which seems to be a main cause of schedule delays and mispredictions?

    Edit: Some more thoughts after reading a few answers. If it comes down to having to choose which delivery date you're comfortable presenting to your higher-ups by squinting at the delivery-probability graph and choosing 80%, or 95%, or 60% (based on what, exactly?), then we've resorted to plain old buffering/factoring of our estimates. In which case, couldn't we have skipped the meticulous case-by-case hour-sized estimation step? By forcing ourselves to break down tasks that take more than a day into smaller chunks of work, haven't we just deluded ourselves into thinking our planning is as tight and thorough as it could be?

    People may be consistently bad estimators who do not even learn from their past mistakes. In that respect, having an EBS system is certainly better than not having one. But what can we do about the fact that we're not that good at planning either? I'm not sure it's a problem that can be solved by a similar system. Our estimates are wrong because of tendencies to be overly optimistic/pessimistic about certain tasks, and because we neglect to account for systematic delays (e.g. sick days, major bug crises) - and usually not because we lack knowledge about the work that needs to be done. Our planning, on the other hand, is often incomplete because we simply don't have enough knowledge at this early stage, and I don't see how an EBS-like system could fill that gap. So we're back to methodology: we need a way to accommodate bad or incomplete work plans that's better than voodoo multiplication.

    Read the article

  • Oracle OpenWorld Recap - A Walk in the Clouds (and heat in San Francisco)!

    - by Di Seghposs
    Whether you were one of the 50,000 attendees in San Francisco or one of the million+ online attendees – we’d like to thank you for joining us at Oracle OpenWorld last week! With temperatures in the 80s and 90s, attendees traveled the overheated streets to join packed keynotes and general sessions – all to find the information they came in search of – Oracle solutions to address their business requirements and challenges. The buzz of this year’s OpenWorld was all about ‘The Cloud’. And, the financial management team joined in the cloud buzz with Thomas Kurian’s keynote which highlighted our ERP Cloud Service as the most complete cloud service on the market. Offering the full breadth of business operations, including Financial Management, Risk and Control Management, Project Portfolio Management, Procurement, Sourcing, and Inventory Management, Oracle ERP Cloud Service transforms the back office into a collaborative, efficient, and intuitive hub. And, our product marketing expert on Financial Management, Annette Melatti, provided a glimpse of what the office of finance looks like in the 21st century as well as shared what’s next for Oracle’s financial solutions discussing the future of Financial Management with Fusion Financials, E-Business Suite, PeopleSoft and the JD Edwards solutions. There were over 120 sessions from customers, partners, and Oracle experts that addressed financial management solutions along with demo pods and Meet the Experts sessions. We hope you found what you were looking for! Missed any of the keynotes or general sessions? Watch them on demand here. At OpenWorld, we also announced that Lending Club, the leading platform for investing in and obtaining personal loans, has selected Oracle ERP Cloud Service to help improve decision-making, implement robust reporting, and take advantage of the cost savings provided by the cloud. The CFO of Lending Club, Carrie Dolan had mentioned that they “are an innovative, data-intensive, high-growth company and needed a solution and partner that could match us. We conducted a thorough review of our options, and Oracle ERP Cloud Service was the clear winner in terms of capabilities and business value as well as commitment to us as a customer.” Read the entire release here. For now, it’s back to business as we gear up for the second half of our fiscal year and start planning for Oracle OpenWorld 2013!

    Read the article

  • juju bootstrap fails with a local environment, why?

    - by Braiam
    Each time I try to bootstrap juju using a local environment it fails starting the juju-db-braiam-local script, as follows:

        $ sudo juju --debug --verbose bootstrap
        2013-10-20 02:28:53 INFO juju.provider.local environprovider.go:32 opening environment "local"
        2013-10-20 02:28:53 DEBUG juju.provider.local environ.go:210 found "10.0.3.1" as address for "lxcbr0"
        2013-10-20 02:28:53 DEBUG juju.provider.local environ.go:234 checking 10.0.3.1:8040 to see if machine agent running storage listener
        2013-10-20 02:28:53 DEBUG juju.provider.local environ.go:237 nope, start some
        2013-10-20 02:28:53 DEBUG juju.environs.tools storage.go:87 Uploading tools for [raring precise]
        2013-10-20 02:28:53 DEBUG juju.environs.tools build.go:109 looking for: juju
        2013-10-20 02:28:53 DEBUG juju.environs.tools build.go:150 checking: /usr/bin/jujud
        2013-10-20 02:28:53 INFO juju.environs.tools build.go:156 found existing jujud
        2013-10-20 02:28:53 INFO juju.environs.tools build.go:166 target: /tmp/juju-tools243949228/jujud
        2013-10-20 02:28:53 DEBUG juju.environs.tools build.go:217 forcing version to 1.14.1.1
        2013-10-20 02:28:53 DEBUG juju.environs.tools build.go:37 adding entry: &tar.Header{Name:"FORCE-VERSION", Mode:420, Uid:0, Gid:0, Size:8, ModTime:time.Time{sec:63517832933, nsec:278894120, loc:(*time.Location)(0x108fda0)}, Typeflag:0x30, Linkname:"", Uname:"ubuntu", Gname:"ubuntu", Devmajor:0, Devminor:0, AccessTime:time.Time{sec:63517832933, nsec:278894120, loc:(*time.Location)(0x108fda0)}, ChangeTime:time.Time{sec:63517832933, nsec:278894120, loc:(*time.Location)(0x108fda0)}}
        2013-10-20 02:28:53 DEBUG juju.environs.tools build.go:37 adding entry: &tar.Header{Name:"jujud", Mode:493, Uid:0, Gid:0, Size:19179512, ModTime:time.Time{sec:63517832933, nsec:274894120, loc:(*time.Location)(0x108fda0)}, Typeflag:0x30, Linkname:"", Uname:"ubuntu", Gname:"ubuntu", Devmajor:0, Devminor:0, AccessTime:time.Time{sec:63517832933, nsec:274894120, loc:(*time.Location)(0x108fda0)}, ChangeTime:time.Time{sec:63517832933, nsec:274894120, loc:(*time.Location)(0x108fda0)}}
        2013-10-20 02:28:55 INFO juju.environs.tools storage.go:106 built 1.14.1.1-raring-amd64 (4196kB)
        2013-10-20 02:28:55 INFO juju.environs.tools storage.go:112 uploading 1.14.1.1-precise-amd64
        2013-10-20 02:28:55 INFO juju.environs.tools storage.go:112 uploading 1.14.1.1-raring-amd64
        2013-10-20 02:28:55 INFO juju.environs.tools tools.go:29 reading tools with major version 1
        2013-10-20 02:28:55 INFO juju.environs.tools tools.go:34 filtering tools by version: 1.14.1.1
        2013-10-20 02:28:55 INFO juju.environs.tools tools.go:37 filtering tools by series: precise
        2013-10-20 02:28:55 DEBUG juju.environs.tools storage.go:41 reading v1.* tools
        2013-10-20 02:28:55 DEBUG juju.environs.tools storage.go:61 found 1.14.1.1-precise-amd64
        2013-10-20 02:28:55 DEBUG juju.environs.tools storage.go:61 found 1.14.1.1-raring-amd64
        2013-10-20 02:28:55 INFO juju.environs.boostrap bootstrap.go:57 bootstrapping environment "local"
        2013-10-20 02:28:55 INFO juju.environs.tools tools.go:29 reading tools with major version 1
        2013-10-20 02:28:55 INFO juju.environs.tools tools.go:34 filtering tools by version: 1.14.1.1
        2013-10-20 02:28:55 INFO juju.environs.tools tools.go:37 filtering tools by series: precise
        2013-10-20 02:28:55 DEBUG juju.environs.tools storage.go:41 reading v1.* tools
        2013-10-20 02:28:55 DEBUG juju.environs.tools storage.go:61 found 1.14.1.1-precise-amd64
        2013-10-20 02:28:55 DEBUG juju.environs.tools storage.go:61 found 1.14.1.1-raring-amd64
        2013-10-20 02:28:55 DEBUG juju.provider.local environ.go:395 create mongo journal dir: /home/braiam/.juju/local/db/journal
        2013-10-20 02:28:55 DEBUG juju.provider.local environ.go:401 generate server cert
        2013-10-20 02:28:55 INFO juju.provider.local environ.go:421 installing service juju-db-braiam-local to /etc/init
        2013-10-20 02:28:56 ERROR juju.provider.local environ.go:423 could not install mongo service: exec ["start" "juju-db-braiam-local"]: exit status 1 (start: Job failed to start)
        2013-10-20 02:28:56 ERROR juju supercommand.go:282 command failed: exec ["start" "juju-db-braiam-local"]: exit status 1 (start: Job failed to start)
        error: exec ["start" "juju-db-braiam-local"]: exit status 1 (start: Job failed to start)

    What is the reason for this error and how do I solve it?
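
    The output only shows that upstart refused to start the juju-db-braiam-local job (the MongoDB instance the local provider installs), not why. A hedged first diagnostic step, assuming upstart wrote its usual per-job log and that the job name from the output above is correct, would be:

        # try to start the job by hand, then look at upstart's log for it
        sudo start juju-db-braiam-local
        sudo cat /var/log/upstart/juju-db-braiam-local.log

    Whatever mongod prints there (a rejected option, a missing directory, a port already in use) is usually the real reason behind the generic "Job failed to start" message.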

    Read the article

  • Next Fusion CRM Webinar for Partners (Monday March 19th, 3pm GMT): Fusion CRM User Interface, Activity Streams and Opportunity management

    - by Richard Lefebvre
    The next session of our weekly Fusion CRM webinar for EMEA partners will take place Monday March the 19th at 3pm GMT / 4pm CET and will address the Fusion CRM User Interface, Activity Streams and Opportunity Management. To check the complete agenda and see the login details, please visit our dedicated microsite. How to join the dedicated microsite: click on http://isdportal.oracle.com/isd_html/sf.htm, enter your Email Address in the corresponding field, and enter fusion_crm in the “Access URL/Page Token” field.

    Agenda: the list of sessions is published and will be regularly updated in the microsite. Duration: each session lasts up to 60 minutes. Webex: the webinar link and session ID for each session are published in the microsite. Audio: the audio call details (telephone numbers by country, call number and password) are indicated in the microsite. Slides: for your convenience, a pdf copy of each presentation will be stored in the microsite's document section.

    We hope that this series of webcasts will be instrumental to your Fusion CRM business success! For further information please contact me at [email protected]

    Read the article

  • Working as Test Engineer and looking to move into Identity Management Technology. Possible?

    - by Aditi Bhatnagar
    I have been working as a Test Engineer for the past 2.5 years. The project is related to Identity Management and I am in love with the technology. I want to move into the same field. I don't aim to be a hard-core coder but rather an analyst or an IDM architect. Is it realistically possible to do so? I see some possible issues, since the field is fairly new in India and for the time being I don't have any coding/deployment experience at all. If it is nonetheless possible to switch into this new technology, what kind of effort do I have to put in? What are the possible steps you can suggest to try this switch?

    Read the article

  • Oracle Applications Day 2012. Experience the Global Innovation of Management Applications

    - by antonella.buonagurio
    10 October 2012 – Milano, East End Studios | 17 October 2012 – Roma, Officine Farneto

    Join the event dedicated to the community of Customers and Partners, to network and to share experiences about the most innovative solutions for facing today's and tomorrow's challenges.

    In Milano (10/10/2012) the speakers will include, among others: Enrico Ancona, CEO of Imperia & Monferrina, with Business Reply; and Massimiliano Gerli, CIO of Amplifon, with Michele Paolin, Senior Manager at Deloitte eXtended Business Services.

    In Roma (17/10/2012) the speakers will include, among others: Giulio Carone, CFO of Enel Green Power, with Claudio Arcudi, Senior Executive at Accenture; Gianluca D'Aniello, CIO of Sky, with Giorgio Pitruzzello, Manager at Deloitte Consulting; and Spartaco Parente, EPD Change & Label Control at Abbott, with Business Reply.

    Contributions are also planned from Abbott, Aeroporto di Napoli, Amplifon, Dema Aerospace, Enel Green Power, Fiera Milano, Imperia & Monferrina, La Rinascente, Safilo, Sky, Spal, Technogym, Tiscali and Tivù, speaking on: Innovation for Human Resources; Performance Management Excellence; Empower Applications with Technology (Milano); Applications for Public Sector (Roma); Next Generation Global Operations; Customer Experience Revolution.

    In addition, more than ten Instant Workshops will let you learn about and share the experience of the Partners and companies that use Oracle solutions successfully. Register on the website.

    Take part in the Oracle I.M.A.G.E. photo contest and win your own iPad! Shoot the images that, for you, describe the five concepts of the event (Innovation, Management, Applications, Global, Experience) and send them by e-mail. To enter the contest, visit the Concorso page on the website. Don't miss the most "social cool" event of the year!

    Read the article

  • Why do I have to choose between "management" and "technical" tracks in my career?

    - by Stephen Gross
    I was recently laid off, and although I found a new gig I'm a bit frustrated with how career tracks work in the land of software development. I really love doing a bit of everything: coding, testing, architect(ing), leadership/management, customer contact, requirements gathering, staff development, etc. Software companies, however, want me to fit into a niche: I'm either a coder, a tester, or a manager. When I try to explain to them that I'm best when I'm doing all of those at once, they seem very confused. I'm sympathetic to their interests, but at the same time frustrated that the industry works this way. Any advice? Do I just need to get with the program, so to speak?

    Read the article

  • Oracle Applications Day 2012. Experience the Global Innovation of Management Applications

    - by antonella.buonagurio
    10 October 2012 – Milano, East End Studios | 17 October 2012 – Roma, Officine Farneto

    Registration is open for the events dedicated to the community of Customers and Partners: an unmissable occasion in Milano and Roma to share the most innovative solutions and the most significant experiences about the strategic choices for facing today's and tomorrow's challenges. Register on the website.

    Oracle Applications Day 2012 also gives you the chance to take part in the Oracle I.M.A.G.E. photo contest and win an iPad! Shoot the images that, for you, describe the five concepts of the event (Innovation, Management, Applications, Global, Experience) and send them from your smartphone. To enter the contest, visit the Concorso page on the website.

    And don't forget ORACLE RED! Become a protagonist of the visual communication of Oracle Applications Day: take your own Red Oracle picture and post it on INSTAGRAM with the hashtag #OracleApps_Red. Your photos could become the slideshow projected at the opening of the event. For further information visit the website. Don't miss the most "social cool" event of the year! The Oracle Team

    Read the article

  • Oracle Applications Day 2012. Experience the Global Innovation of Management Applications

    - by antonella.buonagurio
    Register now for Oracle Applications Day 2012 and take part in the Oracle I.M.A.G.E. photo contest. Only a few days are left to enter the CONTEST, and there are many chances to win your iPad (*)! You have until 5 OCTOBER to send in your Oracle I.M.A.G.E. photographs and win one of the 5 iPads (*) up for grabs in each of the two cities! Don't miss this chance: shoot the images that, for you, describe the five concepts of the event and send them by e-mail to [email protected], indicating in the subject of the mail the theme of the photograph (Innovation, Management, Applications, Global, Experience), and in the body of the mail your first and last name and the city in which you will attend Applications Day 2012, Milano or Roma.

    10 October 2012 – Milano, East End Studios | 17 October 2012 – Roma, Officine Farneto

    The event for sharing with Oracle Customers and Partners the most innovative solutions and the most significant experiences about the strategic choices for facing today's and tomorrow's challenges. Register for the event on the website.

    Read the article

  • Using components in the XNA Game State Management example?

    - by Zolomon
    In the game state management example at the App Hub, they say that if you want to use components in the example you can extend the GameScreen to host other components inside itself. I'm having a very hard time trying to tie this together. I tried extending the GameScreen class by adding a public property of public List<DrawableGameComponent> components { get; set; } and then adding my components to that list when I initialize the current screen, as well as looping over the components in the LoadContent, Update and Draw methods. However, this doesn't feel like the correct way to go - mainly because it doesn't work when I get to the implementation of my GameplayScreen. Any thoughts?
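
    A minimal sketch of what the forwarding might look like, assuming the GameScreen and ScreenManager members from the sample (the SomeHud component and the other names are placeholders). The key point is that components hosted this way are never added to Game.Components, so the screen has to drive their whole lifecycle itself:

        using System.Collections.Generic;
        using Microsoft.Xna.Framework;

        // placeholder component purely for illustration
        class SomeHud : DrawableGameComponent
        {
            public SomeHud(Game game) : base(game) { }
            public override void Draw(GameTime gameTime) { /* draw the HUD here */ }
        }

        public class GameplayScreen : GameScreen
        {
            // Hosted components; NOT in Game.Components, so they get no automatic calls.
            readonly List<DrawableGameComponent> components = new List<DrawableGameComponent>();

            public override void LoadContent()
            {
                components.Add(new SomeHud(ScreenManager.Game));
                foreach (var component in components)
                    component.Initialize();   // also triggers the component's LoadContent once the device exists
            }

            public override void Update(GameTime gameTime, bool otherScreenHasFocus,
                                        bool coveredByOtherScreen)
            {
                base.Update(gameTime, otherScreenHasFocus, coveredByOtherScreen);
                foreach (var component in components)
                    if (component.Enabled)
                        component.Update(gameTime);
            }

            public override void Draw(GameTime gameTime)
            {
                foreach (var component in components)
                    if (component.Visible)
                        component.Draw(gameTime);
            }
        }

    If the components never show up on screen, the usual culprit is that Initialize or Draw is being skipped for them, since nothing calls those methods automatically outside Game.Components.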

    Read the article

  • Content Management Systems - Why Should I Build My Website With One?

    Why would you want your website built using a Content Management System (CMS)? Well, there are quite a few compelling reasons. A CMS is driven by a database that stores all the content of the website and only delivers pages when they are called for by the user's browser. The CMS has a "back end" where content is added to the database, and all you need to use it is your browser. This changes everything! It means that for the first time a site owner can make changes to their website when they want to and how they want to - read on for more information...

    Read the article

  • MySQL table data transformation -- how can I dis-aggregate MySQL time data?

    - by lighthouse65
    We are coding for a MySQL data warehousing application that stores descriptive data (User ID, Work ID, Machine ID, Start and End Time columns in the first table below) associated with time and production quantity data (Output and Time columns in the first table below) upon which aggregate (SUM, COUNT, AVG) functions are applied. We now wish to dis-aggregate the time data for another type of analysis. Our current data table design:

        +---------+---------+------------+---------------------+---------------------+--------+------+
        | User ID | Work ID | Machine ID | Event Start Time    | Event End Time      | Output | Time |
        +---------+---------+------------+---------------------+---------------------+--------+------+
        | 080025  | ABC123  | M01        | 2008-01-24 16:19:15 | 2008-01-24 16:34:45 | 2120   | 930  |
        +---------+---------+------------+---------------------+---------------------+--------+------+

    The dis-aggregation reprocessing that we would like to do would transform the table content to a granularity of minutes, rather than the current production-event ("Event Start Time" and "Event End Time") granularity. The resulting reprocessing of the existing table rows would look like:

        +---------+---------+------------+---------------------+--------+
        | User ID | Work ID | Machine ID | Production Minute   | Output |
        +---------+---------+------------+---------------------+--------+
        | 080025  | ABC123  | M01        | 2010-01-24 16:19    | 133    |
        | 080025  | ABC123  | M01        | 2010-01-24 16:20    | 133    |
        | 080025  | ABC123  | M01        | 2010-01-24 16:21    | 133    |
        | 080025  | ABC123  | M01        | 2010-01-24 16:22    | 133    |
        | 080025  | ABC123  | M01        | 2010-01-24 16:23    | 133    |
        | 080025  | ABC123  | M01        | 2010-01-24 16:24    | 133    |
        | 080025  | ABC123  | M01        | 2010-01-24 16:25    | 133    |
        | 080025  | ABC123  | M01        | 2010-01-24 16:26    | 133    |
        | 080025  | ABC123  | M01        | 2010-01-24 16:27    | 133    |
        | 080025  | ABC123  | M01        | 2010-01-24 16:28    | 133    |
        | 080025  | ABC123  | M01        | 2010-01-24 16:29    | 133    |
        | 080025  | ABC123  | M01        | 2010-01-24 16:30    | 133    |
        | 080025  | ABC123  | M01        | 2010-01-24 16:31    | 133    |
        | 080025  | ABC123  | M01        | 2010-01-24 16:32    | 133    |
        | 080025  | ABC123  | M01        | 2010-01-24 16:33    | 133    |
        | 080025  | ABC123  | M01        | 2010-01-24 16:34    | 133    |
        +---------+---------+------------+---------------------+--------+

    So the reprocessing would take an existing row of data created at the granularity of a production event and change the granularity to minutes, eliminating the now-redundant (Event End Time, Time) columns while doing so. It assumes a constant rate of production and divides Output by the difference in minutes plus one to populate the new table's Output column. I know this can be done in code... but can it be done entirely in a MySQL insert statement (or otherwise entirely in MySQL)? I am thinking of an INSERT ... INTO construction but keep getting stuck. An additional complexity is that there are hundreds of machines to include in the operation, so there will be multiple rows (one for each machine) for each minute of the day. Any ideas would be much appreciated. Thanks.
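
    A hedged sketch of the kind of INSERT ... SELECT this could boil down to, assuming a helper table of integers (called numbers below, with a column n holding 0, 1, 2, ... up to at least the longest event in minutes) and placeholder table and column names. Each event row is joined to one integer per elapsed minute, and Output is divided by the minute difference plus one, exactly as described above:

        INSERT INTO production_minutes (user_id, work_id, machine_id, production_minute, output)
        SELECT  e.user_id,
                e.work_id,
                e.machine_id,
                DATE_FORMAT(e.event_start_time + INTERVAL n.n MINUTE, '%Y-%m-%d %H:%i'),
                e.output / (TIMESTAMPDIFF(MINUTE, e.event_start_time, e.event_end_time) + 1)
        FROM    production_events AS e
        JOIN    numbers AS n
          ON    n.n <= TIMESTAMPDIFF(MINUTE, e.event_start_time, e.event_end_time);

    The per-machine fan-out comes for free, since every source row produces its own set of minute rows; the only extra object needed is the integers table, which can be filled once and reused.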

    Read the article

  • Why do many software projects fail today?

    - by TomTom
    As long as there have been software projects, the world has wondered why they fail so often. I would like to know if there is a list or something equivalent which shows how many software projects fail today. It would be nice if there were a comparison over the last 20-30 years. You can also add your top reason why a software project fails. Mine is "Requirements are poor or not even existing", which also includes "No (real) customer / user involved".

    EDIT: It is nearly impossible to clearly define the term "fail". Let's say that fail means: the project was more than 10% over budget and time. In my opinion the 10% +/- is a good range for an offer / tender.

    EDIT: Until now (Feb 11) it seems that most posters agree that a failed project is basically a failure of project management (whatever "fail" means). But IMHO it comes out that most developers are not happy with this situation. Perhaps because it is not the manager who gets penalized when a project is not successful, but the supposedly lazy, incompetent developer teams? When I read the posts I can also hear that there is a big "gap" between the developer side and the management side. The expectations (perhaps also the requirements) seem to be so different that a project cannot be successful in the end (over time / budget; users are not happy; not all first-priority features implemented; too many bugs because developers were forced to implement in too short timeframes...). I'm asking myself: How can we improve it? Or do we even have the possibility to improve it? Everybody seems to be unsatisfied with the way it goes now. Can we close the gap between these two worlds? Should we (the developers) go on strike and fight for "high quality requirements" and "realistic / iteration-based time schedules"?

    EDIT: Ralph Westphal and Stefan Lieser have founded a new "community" called Clean Code Developer. The aim of the group is to bring more professionalism into software engineering. Regardless of whether a developer has a degree or tons of years of experience, you can be part of this movement. Clean Code Developers live principles like SOLID every day. A professional developer is the biggest reviewer of his own work, and he has an internal value system which helps him to improve and become better. Check it out at: Clean Code Developer

    EDIT: Our company is doing at the moment a thing called "Application Development and Maintenance Benchmarking". This is a service offered by IBM to get feedback from someone external on your software engineering process quality etc. When we get the results, I will tell you more about it.

    Read the article

  • Why does my .NET Windows service not start automatically sometimes?

    - by Tomek
    Hi all, I have modified a working Windows service that had always started correctly before. After adding the System.Management reference it now sometimes will not start automatically. I get the following error:

        Service cannot be started. System.Runtime.InteropServices.COMException (0x80010002): Call was canceled by the message filter. (Exception from HRESULT: 0x80010002 (RPC_E_CALL_CANCELED))

    I found another post here on SO with someone having the same issue: http://stackoverflow.com/questions/998883/why-wont-my-net-windows-service-start-automatically-after-a-reboot The proposed solution there was to have the service start after the services it depends on have started. However, when I go to the Dependencies tab for my service, I see:

    Should I just use the workaround method of putting the thread to sleep, or is there a more proper way of getting this service to start correctly? Is this happening because .NET has not started before my service starts? Thanks, Tomek
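
    One more structured alternative to sleeping, in line with the dependency idea above, would be to declare the dependency explicitly in the service installer so that the service control manager waits for the required service before starting this one. The sketch below assumes, and this is only an assumption, that the relevant dependency is the Windows Management Instrumentation service ("Winmgmt"), since that is what the System.Management classes talk to; the installer and service names are placeholders.

        using System.ComponentModel;
        using System.Configuration.Install;
        using System.ServiceProcess;

        [RunInstaller(true)]
        public class MyServiceInstaller : Installer
        {
            public MyServiceInstaller()
            {
                var processInstaller = new ServiceProcessInstaller
                {
                    Account = ServiceAccount.LocalSystem
                };

                var serviceInstaller = new ServiceInstaller
                {
                    ServiceName = "MyService",                 // placeholder service name
                    StartType = ServiceStartMode.Automatic,
                    ServicesDependedOn = new[] { "Winmgmt" }   // assumed dependency on WMI
                };

                Installers.Add(processInstaller);
                Installers.Add(serviceInstaller);
            }
        }

    The service has to be reinstalled for the new dependency to appear on the Dependencies tab.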

    Read the article
