Search Results

Search found 18329 results on 734 pages for 'interpret order'.

Page 226/734

  • Welch's Juices-up Its Inventory Management with Oracle Supply Chain

    - by [email protected]
    Supply & Demand Chain Executive recently published a great success story about Welch's, which worked with Take Supply Chain and G.SI to implement Oracle Process Manufacturing. The company says it's been able to improve operational control, inventory accuracy, visibility, and order fulfillment by automating its processes across three production/warehousing locations nationwide. Improving warehouse and inventory management operations creates efficiencies across a high-velocity nationwide supply chain. Welch's production facilities were collecting more information than ever before on the flow of materials and inventory, but the company needed an effective and accurate method to organize and manage these data. Article found at: http://www.sdcexec.com/publication/article.jsp?pubId=1&id=12256&pageNum=2

    Read the article

  • I cannot rename files in bulk using Ubuntu's rename feature

    - by user254174
    I cannot rename files in bulk using Ubuntu's rename feature. The files are on an NTFS partition. I want to rename files that look like this: whatever pic george.jpg, tacoma narrows bridge.jpg, green bottle.jpg to: filename (1), filename (2), filename (3). And I cannot do this at all. I don't want to use the command line either. I want to do this so that I can permanently erase files after I have encrypted them, without exposing their contents to people who use a file recovery tool. I also don't want a method that takes days or months, renaming one file at a time; if I have hundreds of files to rename, that won't be an option. I want to give each file the same name, numbered in order as shown above. PyRenamer is not an option for me, unless you can show me how to do that in PyRenamer.

    Read the article

  • OWA for iOS devices

    - by marc dekeyser
    Originally posted on: http://geekswithblogs.net/marcde/archive/2013/07/23/owa-for-ios-devices.aspx I was at the launch presentation of OWA for iOS devices and boy, does that look exciting! There is now a full app for Office 365 supporting OWA offline and many more options. Support for Exchange 2013 on-premises deployments is not there yet, but is planned to come soon (when it's ready!). "Our goal is to help our customers remain productive anytime, anywhere. This includes providing a great email experience on smartphones and tablets. Windows Phone 8 comes with a top-notch native email client in Outlook Mobile, and we offer Exchange ActiveSync (EAS), which is the de-facto industry standard for accessing Exchange email on mobile devices. In order to better support many of our customers who use their iPhones and iPads for work, we are introducing OWA for iPhone and OWA for iPad, which bring a native Outlook Web App experience to iOS devices!" Read more: http://blogs.office.com/b/office365tech/archive/2013/07/16/owa-for-iphone-and-owa-for-ipad.aspx

    Read the article

  • Why did Apple remove Python support in Mavericks, aka Mac OS X 10.9?

    - by alex gray
    Apple has removed Python support (at least at the developer level) in 10.9. Python IS still on the machine in /System/Library/Frameworks/Python.framework... but trying to link to Python using the 10.9 SDK fails: /Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/MacOSX10.9.sdk/System/Library/Frameworks does not contain Python. I'm not a Pythonista, but I find it interesting that Apple has made this change. I don't understand why this was done, and I'm a bit annoyed that I have to remove Python from my compilation units in order to compile with the 10.9 SDK. Is this a statement by Apple, along the lines of "People aren't using Python very much anymore, so we're going to phase out support"? Or was something else driving the change?

    Read the article

  • Monitoring almost anything with BizTalk 360

    - by Michael Stephenson
    When you work in an integration environment it is common to find yourself in a situation where you integrate with some unusual applications or have some unusual dependencies. That is the nature of integration. When you work with BizTalk, one of the common problems is that BizTalk is often the place where problems with the applications you integrate with are highlighted, and these external applications may have poor monitoring solutions. Fortunately, if you are working with a customer who uses BizTalk 360, it contains a feature called the "Web Endpoint Manager". Typically the Web Endpoint Manager is used to monitor web services that you integrate with; it will ping them at appropriate times to make sure they return the expected HTTP status code.
    When you have an unusual situation where you want to monitor something which is key to the success of your solution, but you find yourself having to consider a significant custom solution to monitor the external dependency, the Web Endpoint Manager could be your friend. The endpoint manager monitors a URL and checks for a certain status code. This means that you can create your own ASPX web page and then make BizTalk 360 monitor this web page. Behind the web page you could write any code you wished. An example of this architecture is shown in the diagram in the original post.
    In the custom web page you would implement some custom code to do whatever it is that you want to monitor. In the code snippet below you can see how the Page_Load default method does some kind of check and then, depending on the result of the check, returns a certain HTTP code.

    protected void Page_Load(object sender, EventArgs e)
    {
        var result = CheckSomething();

        if (result == "Success")
            Response.StatusCode = 202;
        else if (result == "DatabaseError")
            Response.StatusCode = 510;
        else if (result == "SystemError")
            Response.StatusCode = 512;
        else
            Response.StatusCode = 513;
    }

    In BizTalk 360 you would go into the Monitor and Notify tab and then to BizTalk Environment, which gives you access to the Web Endpoint Manager. You need an alarm set up which configures how the endpoint will be checked. I'm not going to go through the details of creating the alarm as this is already documented in the BizTalk 360 documentation. One point to note is that in the example I am using, I set up a threshold alarm, which means that the URL is checked about every minute and, if there is an error that persists for a period of time, the alarm will raise the alert notification. In my example I configured the alarm to fire if the error persisted for 3 minutes.
    In the Web Endpoint Manager you would then configure your endpoint to monitor and the HTTP response code which indicates all is working fine. I now have my endpoint monitoring set up, and BizTalk 360 should be checking my custom endpoint to see that it is available. If I want to manually sanity-check that the endpoints I have registered are working fine, clicking the Refresh button will show whether they are all good or not. If the custom ASP.NET page which is checking my dependency hits a problem, you will see in the endpoint manager that the status code does not match the expected return code, your endpoints will display in red, and you can see the problem. If I use specific HTTP response codes for the errors the custom ASP.NET page might encounter, I can easily interpret these to know what the problem is.
    Using the alarms and notifications in BizTalk 360, when your endpoint goes into an error state you can easily configure email or SMS notifications from BizTalk 360 to tell you that your endpoint is having problems, and you can use BizTalk 360 to help correlate what the problem is to allow you to investigate further. When everything returns to normal you will see that the status is now fixed, the web endpoints are green again, and the return code matches what is expected.
    Conclusion
    As you can see, it is really easy to plug your own custom ASP.NET page into the BizTalk 360 web endpoint monitoring feature. This extension gives you the power to extend the monitoring to almost anything you want, as long as you can write some .NET code to check that the dependency is available and working. It would be interesting to hear of any ideas people have around things they would monitor with this extension. More details on the endpoint monitor can be found at the following link: http://www.biztalk360.com/tour/monitoring_notifications
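
    The post leaves CheckSomething() as a placeholder for whatever your dependency check actually is. Purely as a rough sketch: the version below assumes the dependency is a SQL Server database reachable through a hypothetical "MonitoredDb" connection string, and maps failures onto the same result strings used in Page_Load above.

    // Hypothetical implementation of the CheckSomething() placeholder, sitting in the
    // same ASPX code-behind. The connection string name and the checks are assumptions;
    // substitute whatever external dependency your solution actually needs to verify.
    private string CheckSomething()
    {
        try
        {
            var connectionString = System.Configuration.ConfigurationManager
                .ConnectionStrings["MonitoredDb"].ConnectionString;

            using (var connection = new System.Data.SqlClient.SqlConnection(connectionString))
            {
                connection.Open();   // fails if the database dependency is unavailable
            }
        }
        catch (System.Data.SqlClient.SqlException)
        {
            return "DatabaseError";
        }
        catch (System.Exception)
        {
            return "SystemError";
        }

        return "Success";
    }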

    Read the article

  • Arcad C1 3D CAD

    - by borisha
    Recently I saw a version of Arcad in the Ubuntu Software Center. What kind of version is this? A trial version, an evaluation version, or the full software for only 32 US dollars? Second question: I tried to buy this software from the Software Center, but for some reason my online transaction couldn't complete successfully. I contacted my bank, but they told me there was insufficient credit, which is not possible. Anyway, is there another way, like a bank transfer order, to buy this software?

    Read the article

  • Puppet: Making Windows Awesome Since 2011

    - by Robz / Fervent Coder
    Originally posted on: http://geekswithblogs.net/robz/archive/2014/08/07/puppet-making-windows-awesome-since-2011.aspx Puppet was one of the first configuration management (CM) tools to support Windows, way back in 2011. It has the heaviest investment in Windows infrastructure, with 1/3 of the platform client development staff being Windows folks. It appears that Microsoft believed an end-state configuration tool like Puppet was the way forward, so much so that they cloned Puppet's DSL (domain-specific language) in many ways and are calling it PowerShell DSC. Puppet Labs is pushing the envelope on Windows. Here are several things to note:
    - Puppet x64 Ruby support for Windows coming in v3.7.0.
    - An awesome ACL module (with ordering, SIDs, and very granular control of permissions, it is the best of any CM).
    - A wealth of modules that work with Windows on the Forge (and more on GitHub).
    - Documentation solely for Windows folks - https://docs.puppetlabs.com/windows.
    - Some of the common learning points for Puppet on Windows users are noted in this recent blog post.
    - Microsoft OpenTech supports Puppet.
    - Azure has the ability to deploy a Puppet Master (http://puppetlabs.com/solutions/microsoft).
    - At Microsoft //Build 2014, in the Day 2 keynote, Puppet Labs CEO Luke Kanies co-presented with Mark Russinovich (http://channel9.msdn.com/Events/Build/2014/KEY02 - fast forward to 19:30)!
    - Puppet has a Visual Studio plugin!
    It can be overwhelming learning a new tool like Puppet at first, but Puppet Labs has some resources to help you on that path. Take a look at the Learning VM, which has a quest-based learning tool. For real-time questions, feel free to drop onto #puppet on freenode.net (yes, some folks still use IRC) with questions, and #puppet-dev with thoughts/feedback on the language itself. You can subscribe to the puppet-users / puppet-dev mailing lists. There is also ask.puppetlabs.com for questions, and Server Fault if you want to go to a Stack Exchange site. There are books written on learning Puppet. There are even Puppet User Groups (PUGs) and other community resources!
    Puppet does take some time to learn, but with anything you need to learn, you need to weigh the benefits versus the ramp-up time. I learned NHibernate once; it had a very high ramp-up time back then, but was the only game on the street. Puppet's ramp-up time is considerably less than that. The advantage is that you are learning a DSL, and it can apply to multiple platforms (Linux, Windows, OS X, etc.) with the same Puppet resource constructs.
    As you learn Puppet you may wonder why it has a DSL instead of just leveraging the language of Ruby (or maybe this is one of those things that keeps you up wondering at night). I like the DSL over a small layer on top of Ruby. It allows the Puppet language to be portable and go more places. It makes you think about the end state of what you want to achieve in a declarative sense instead of in an imperative sense.
    You may also find that right now Puppet doesn't run manifests (scripts) in the order in which resources are specified. This is the number one learning point for most folks. A long-time consternation of some folks about Puppet was that manifest ordering was not possible in the past. In fact, it might be why some other CMs exist! As of 3.3.0, Puppet can do manifest ordering, and it will be the default in Puppet 4. http://puppetlabs.com/blog/introducing-manifest-ordered-resources
    You may have caught earlier that I mentioned PowerShell DSC. But what about DSC? Shouldn't that be what Windows users want to choose? Other CMs are integrating with DSC; will Puppet follow suit and integrate with DSC? The biggest concern that I have with DSC is its lack of visibility in fine-grained reporting of changes (which Puppet has). The other is that it is a very young Microsoft product (pre version 3, you know what they say :) ). I tried getting it working in December and ran into some issues. I'm hoping newer releases actually work; it does have some promising capabilities, it just doesn't quite come up to the standard of something that should be used in production. In contrast, Puppet is almost a ten-year-old language with an active community! It's very stable, and when trusting your business to configuration management, you want something that has been around a while and has been proven. Give DSC another couple of releases and you might see more folks integrating with it. That said, there may be a future with DSC integration.
    Portability and fine-grained reporting of configuration changes are reasons to take a closer look at Puppet on Windows. Yes, Puppet on Windows is here to stay, and it's continually getting better, folks.

    Read the article

  • Dell Inspiron 1120 Ubuntu Light -> Desktop and now I'm having problems with wifi and suspend

    - by David N. Welton
    I got a Dell Inspiron 1120, which ships with Ubuntu Light as well as Windows. My wife prefers Ubuntu, but obviously outside of web stuff you can't do a lot with Light, so I went ahead and installed the Desktop version of Ubuntu (10.10 / maverick). Whereas before it suspended beautifully and connected to wifi networks flawlessly, it now shows the following problems: it seems to suspend OK, but on resume the screen remains blank, even though the computer appears to wake up again; and wifi doesn't connect. I tried using the suggested proprietary drivers, and those don't seem to change the situation. All in all, it's a bit frustrating to run into these sorts of "regressions" - does anyone know what drivers and such Ubuntu Light might have shipped with for this computer that made it work so well? Unfortunately, I wiped the disk in order to install the Desktop version of Ubuntu.

    Read the article

  • RDF and OWL: Have these delivered the promises of the Semantic Web?

    - by Dark Templar
    These days I've been learning a lot about how different scientific fields are trying to move their data over to the Semantic Web in order to "free up data from being stored in isolated silos". I read a lot about how these fields say their efforts are implementing the "visions" of the Semantic Web. As a learner (and purely from a learning perspective) I was curious to know why, if semantic technology is deemed to be so powerful, the efforts have been around for years yet I and a lot of people I know had never even heard of it until very recently. Also, I don't come across any scholarly articles claiming "oh, our inferencing engine was able to make such and such discovery, which is helping us pave our way to solving..." etc. It seems that there are genuine efforts across different institutions, fields, and disciplines to shift all their data to a "semantic" format, but what happens after all that's been done? All the ontologies have been created/unified, and then what?

    Read the article

  • Best in-memory cache of DB objects for Silverlight [closed]

    - by Jon
    Hi, I'd like to set up a cache of database objects (i.e. rows in a table) in memory in Silverlight, which I'll do using WCF and LINQ to SQL. Once I have the objects in memory, I'm planning on using MSMQ to receive new objects whenever they have been modified. It's a somewhat complex approach, but the goal is to reduce trips to the database and allow instant data communication between Silverlight applications that are connected to the MSMQ. My Silverlight applications are meant to be long-running, and the amount of data to be cached will not be large. I'm planning on saving the in-memory cache using local storage. Anyway, in order to process the updated objects that come in, I'd like to know if the user has changed the existing object. Could I use some event related to data binding to set a flag indicating that the object has changes? Maybe there's a better way to do the cache entirely? Thanks!
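
    One data-binding-friendly way to get such a flag, sketched below under the assumption that the client-side objects implement INotifyPropertyChanged (the interface Silverlight data binding already relies on for change notification): subscribe to PropertyChanged from the cache and record a dirty marker. The ClientOrder and CacheEntry names here are hypothetical.

    // Hypothetical client-side entity; any type the UI binds to and edits needs
    // change notification along these lines.
    public class ClientOrder : System.ComponentModel.INotifyPropertyChanged
    {
        public event System.ComponentModel.PropertyChangedEventHandler PropertyChanged;

        private string status;
        public string Status
        {
            get { return status; }
            set
            {
                status = value;
                var handler = PropertyChanged;
                if (handler != null)
                    handler(this, new System.ComponentModel.PropertyChangedEventArgs("Status"));
            }
        }
    }

    // Cache-side tracking: mark an entry dirty when the user edits the bound object,
    // so an incoming MSMQ update can be merged or deferred instead of silently overwriting edits.
    public class CacheEntry
    {
        public ClientOrder Item { get; private set; }
        public bool IsDirty { get; private set; }

        public CacheEntry(ClientOrder item)
        {
            Item = item;
            item.PropertyChanged += (sender, e) => IsDirty = true;
        }
    }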

    Read the article

  • Customer Spotlight: Land O’Lakes

    - by kellsey.ruppel
    Land O’Lakes, Inc. is one of America’s premier member-owned cooperatives, offering local cooperatives and agricultural producers across the nation an extensive line of agricultural supplies, as well as state-of-the-art production and business services. WinField Solutions, a company within Land O’Lakes, is using Oracle WebCenter to improve online experiences for their customers, partners, and employees. The company’s more than 3,000 seed customers, and its more than 300 internal and external sales force members and business partners, use Oracle WebCenter to handle all aspects of account management and order entry through a consolidated, personalized, secure user interface. Learn more about Land O’Lakes and Oracle WebCenter by reading this interview with Barry Libenson, Land O’Lakes chief information officer, or by watching this video.

    Read the article

  • Torchlight II Drops Today; New Classes and Miles of Atmospheric Dungeon Crawling Await

    - by Jason Fitzpatrick
    Torchlight II, sequel to the extremely popular Torchlight action-RPG, is available for sale today. With four new classes and a massively expanded world, you'll have plenty to explore. The new release features extra classes, extra companion creatures, in-game weather systems, and of course: updated graphics and a massively expanded game universe. Trumping all these additions, however, is LAN/internet co-op multiplayer–by far the feature most requested and anticipated by Torchlight fans. Check out the trailer video above to take a peek at the game, read more about it at the Torchlight II site, and then hit up the link below to grab a copy on Steam–you can pre-order it any time, but it won't be officially available for download until 2PM EST today. Torchlight II is Windows-only, $19.99 for a single copy or $59.99 for a friend 4-pack (which includes a copy of Torchlight I). Torchlight II

    Read the article

  • Creating an installer for a python GTK3 application

    - by Noam Gal
    I have just finished developing a Python 2.7 application using GTK3 for the GUI. My question is: how can I now create an installer for Windows, Mac, and Linux (possibly three different installers) so that my end users can easily download the application without having to download Python, GTK, and so on? I have never created an installer from a Python script before. I have heard that there are some tools for this purpose (py2exe? PyInstaller?), but I wouldn't know how and what to pack with them in order for the result to be able to use GTK3. Thanks in advance, Noam.

    Read the article

  • Google AdWords API - credit card safety question

    - by anon
    Hi, Google is asking me to fax a photocopy of my credit card in order to activate the AdWords API in MCC. This really sucks, IMO... they could have had a prepaid option. In any case, my questions: 1) Are there alternatives to this - is there a 3rd-party provider who will give me this service without me sending them the credit card info? 2) How secure is it to send my credit card fax via some online fax service? 3) Do you think they will reject the application if I hide my CVV number in the fax? Any other thoughts appreciated :) thanks

    Read the article

  • Tester-Developer communication

    - by HH_
    While a lot is written about developer-developer, developer-client, and developer-team manager communication, I couldn't find any text which gives guidelines about tester-developer communication and relations. Whether testers and developers are separate teams or in the same one (in my case, I am a lone tester in an agile development project), I believe that how testers are perceived is extremely important in order for testing to be well accepted and to serve its goal of enhancing the quality of the project (for example, they should not be viewed as a police force). Any advice, or studies, about how a tester should communicate with developers? Thank you

    Read the article

  • Are CQRS/DDD/Event Sourcing and REST compatible?

    - by Robin Green
    REST seems to promote the idea of a canonical URL for a resource, and PUTing/POSTing back a modified representation of that resource in order to change it. However, with CQRS - Command Query Responsibility Segregation - one can theoretically have a completely different "API" for reading and for writing, which seems to conflict with the REST ideal of one URL for a resource, and no RPC-style "verbs inside the request body". DDD and Event Sourcing sometimes go together with CQRS, which is why I mention them in this question. So, can CQRS be used together with REST? Or is it against the REST way of doing things? What about DDD? And Event Sourcing? Can they be used with REST?
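
    For concreteness, here is a minimal sketch (all type names hypothetical) of what the segregation looks like in code: commands express intent and mutate state through one path, while queries read a separate model shaped for display. Whether each side can then be mapped onto canonical REST resources is exactly the tension the question describes.

    // Write side: a command carries intent; its handler changes state and returns nothing.
    public class RenameCustomerCommand
    {
        public System.Guid CustomerId { get; set; }
        public string NewName { get; set; }
    }

    public interface ICommandHandler<TCommand>
    {
        void Handle(TCommand command);
    }

    // Read side: queries run against a separate, denormalized read model.
    public class CustomerSummary
    {
        public System.Guid Id { get; set; }
        public string Name { get; set; }
        public int OpenOrders { get; set; }
    }

    public interface ICustomerQueries
    {
        CustomerSummary GetSummary(System.Guid customerId);
    }

    One commonly suggested compromise, rather than a settled answer, is to keep GET on the canonical resource backed by the read model and expose commands as POSTs to intent-named sub-resources, which preserves the command/query split without putting RPC-style verbs in the request body.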

    Read the article

  • BizTalk Envelopes explained

    - by Robert Kokuti
    Recently I've been trying to get some order into an ESB-BizTalk pub/sub scenario, and decided to wrap the payload into standardized envelopes. I have used envelopes before in a 'light weight' fashion, and I found that they can be quite useful and powerful if used systematically. Here is what I learned.
    The Theory
    In my experience, envelopes are often underutilised in a BizTalk solution, and quite often their full potential is not well understood. Here I try to simplify the theory behind envelopes within BizTalk. Envelopes can be used to attach additional data to the 'real' data (payload). This additional data can contain all routing and processing information, and allows treating the business data as a 'black box', possibly compressed and/or encrypted, etc. The point here is that the infrastructure does not need to know anything about the business data content, just as a postman does not need to know the letter within the envelope. BizTalk has built-in support for envelopes through the XMLDisassembler and XMLAssembler pipeline components (these are part of the XMLReceive and XMLSend default pipelines). These components, among other things, perform the following:
    XMLDisassembler
    - Extracts the payload from the envelope into the Message Body.
    - Copies data from the envelope into the message context, as specified by the property schema(s) associated with the envelope schema.
    Typically, once the envelope is through the XMLDisassembler, the payload is submitted into the MessageBox, and the rest of the envelope data is copied into the context of the submitted message. The XMLDisassembler uses the Property Schemas, referenced by the Envelope Schema, to determine the name of the promoted Message Context element.
    XMLAssembler
    - Wraps the Message Body inside the specified envelope schema.
    - Populates the envelope values from the message context, as specified by the property schema(s) associated with the envelope schema.
    Notice that there is no requirement to use the receiving envelope schema when sending. The sent message can be wrapped within any suitable envelope, regardless of whether the message was originally received within an envelope or not. However, by sharing Property Schemas between envelopes, it is possible to pass values from the incoming envelope to the outgoing envelope via the Message Context.
    The Practice
    Creating the Envelope
    Add a new Schema to the BizTalk project. Envelopes are defined as schemas, with the <Schema> Envelope property set to Yes, and the root node's Body XPath property pointing to the node which contains the payload. Typically, you'd create an envelope structure with a Header node and a Body node under the Envelope root. Click on the <Schema> node and set the Envelope property to Yes. Then click on the Envelope node and set the Body XPath property to point to the 'Body' node. The 'Body' node is a Child Element, and its Data Structure Type is set to xs:anyType. This allows the Body node to carry any payload data. The XMLReceive pipeline will submit the data found in the Body node, while the XMLSend pipeline will copy the message into the Body node before sending to the destination.
    Promoting Properties
    Once you have defined the envelope, you may want to promote the envelope data (anything other than the Body) as Property Fields, in order to preserve their values in the message context. Anything not promoted will be lost when the XMLDisassembler extracts the payload from the Body. Typically, this means you promote everything in the Header node. Property promotion uses associated Property Schemas. These are special BizTalk schemas which have a flat field structure. Property Schemas define the names of the promoted values in the Message Context by combining the Property Schema's namespace and the individual field names. It is worth being systematic when it comes to naming your schemas, their namespaces, and type names. A coherent method will make your life easier when it comes to referencing the schemas during development and managing subscriptions (filters) in BizTalk Administration. I developed a fairly coherent naming convention which I'll probably share in another article. Because the property schema must be flat, I recommend creating one for each level in the envelope header hierarchy. Property schemas are very useful in passing data between incoming and outgoing envelopes. As I mentioned earlier, the in/out envelopes do not have to be the same, but you can use the same property schema when you promote the outgoing envelope fields as you used for the incoming schema. As you can reference many property schemas for field promotion, you can pick data from a variety of sources when you define your outgoing envelope. For example, the outgoing envelope can carry some of the incoming envelope's promoted values, plus some values from the standard BizTalk message context, like the AdapterReceiveCompleteTime property from the BizTalk message-tracking properties. The values you promote for the outgoing envelope will be automatically populated from the Message Context by the XMLAssembler pipeline component.
    Using the Envelope
    Receiving: Enveloped messages are automatically recognized by the XMLReceive pipeline, or any other custom pipeline which includes the XMLDisassembler component. The Body Path node will become the Message Body, while the rest of the envelope values will be added to the Message Context, as defined by the Property Schemas referenced by the Envelope Schema.
    Sending: The Send Port's filter expression can use the promoted properties from the incoming envelope. If you want to enclose the sent message within an envelope, the Send Port XMLAssembler component must be configured with the fully qualified envelope name. One way of obtaining the fully qualified envelope name is to copy it from the envelope schema property page; the full envelope schema name is constructed as <Name>, <Assembly>. The outgoing envelope is populated by the XMLAssembler pipeline component: the Message Body is copied to the specified envelope's Body Path node, while the rest of the envelope fields are populated from the Message Context, according to the Property Schemas associated with the Envelope Schema.
    That's all for now, happy enveloping!
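
    To make the promoted values a little more concrete, the fragment below shows how envelope data ends up readable and writable on the message context from a custom pipeline component. The property namespace and the "Priority" field are hypothetical names used only for this sketch; in an orchestration you would reach the same values through a property-schema reference rather than this API.

    // Sketch of part of a custom pipeline component's Execute method (hypothetical names).
    public Microsoft.BizTalk.Message.Interop.IBaseMessage Execute(
        Microsoft.BizTalk.Component.Interop.IPipelineContext pContext,
        Microsoft.BizTalk.Message.Interop.IBaseMessage pInMsg)
    {
        // Namespace of the (assumed) property schema associated with the envelope header.
        const string headerNs = "https://MySolution.PropertySchemas.Header";

        // Read a value the XMLDisassembler promoted from the incoming envelope.
        object priority = pInMsg.Context.Read("Priority", headerNs);

        // Promote a value so routing filters and the outgoing envelope can use it.
        pInMsg.Context.Promote("Priority", headerNs, priority ?? "Normal");

        return pInMsg;
    }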

    Read the article

  • Continue without a default route?

    - by user2009
    I am doing a complete unattended install of Ubuntu 12.04, with static network configuration. Here is the content for the static network configuration from the preseed file:
    d-i netcfg/disable_dhcp boolean true
    d-i netcfg/no_default_route boolean true
    d-i netcfg/get_nameservers string 192.168.1.254
    d-i netcfg/get_ipaddress string 192.168.1.13
    d-i netcfg/get_netmask string 255.255.255.0
    d-i netcfg/get_gateway string 192.168.1.1
    d-i netcfg/confirm_static boolean true
    The installer still asks "Continue without a default route?", and I have to answer it manually; only then does the install go ahead. I am passing the preseed file over the network (preseed/url). How do I avoid this manual intervention? Does the order of the netcfg statements matter?

    Read the article

  • Understanding the static keyword

    - by user985482
    I have some experience in developing with Java, JavaScript, and PHP. I am reading Microsoft Visual C# 2010 Step by Step, which I feel is a very good book for introducing you to the C# language. I seem to be having problems understanding the static keyword. From what I understand so far, if a class is declared static, all its methods and variables have to be static. The Main method is always a static method, so in the class where the Main method exists, all variables and methods you have to call from Main are declared static. Also, I have noticed that in order to call a static method from another class you do not need to create an object of that class; you can use the class name. What are the advantages of declaring static variables and methods? When should I declare static variables and methods?
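
    A short example may make the distinction clearer; the Counter and Program names below are made up for illustration:

    public class Counter
    {
        // Static field: one copy shared by the class, regardless of how many objects exist.
        private static int totalCreated;

        // Instance field: one copy per Counter object.
        private int value;

        public Counter()
        {
            totalCreated++;          // instance code can touch static members
        }

        public void Increment()
        {
            value++;                 // instance method works on per-object state
        }

        // Static method: called on the class itself; it cannot use 'value' directly.
        public static int GetTotalCreated()
        {
            return totalCreated;
        }
    }

    public class Program
    {
        public static void Main()
        {
            var a = new Counter();
            var b = new Counter();
            a.Increment();                                        // instance call needs an object
            b.Increment();
            System.Console.WriteLine(Counter.GetTotalCreated());  // static call uses the class name; prints 2
        }
    }

    Note that a class declared static can contain only static members and cannot be instantiated; for ordinary classes, you mark static only the members that should be shared or callable without creating an object.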

    Read the article

  • Windows 7 Virtualized on Ubuntu Server

    - by garbagecollector
    I have an issue: we are moving to a production build server now, and I need a virtual machine up and running on my Ubuntu 10.10 Server Edition box. I also have to set up and install various tools and plugins on this Windows 7 virtual machine. The problem I am facing is how to install Windows 7 on this machine (Ubuntu 10.10 Server), and how to gain access to it in order to install the tools that are required on it. I would prefer VirtualBox as my tool of choice. Please and thank you.

    Read the article

  • How to boot Ubuntu 12.04 64-bit from a USB stick on a Compaq CQ58

    - by user208092
    I am trying to boot Ubuntu 12.04 64-bit on my Compaq CQ58 laptop from a USB stick, but it is not working. I installed Ubuntu on my pen drive correctly, following the instructions on the Ubuntu website (http://www.ubuntu.com/download/desktop/create-a-usb-stick-on-windows). These are my BIOS settings:
    Post Hotkey Delay (sec) 0
    CD-ROM Boot
    Internal Network Adapter Boot
    Network Boot Protocol
    Legacy Support
    Secure Boot
    Platform Key Enrolled
    Pending Action None
    Clear All Secure Boot
    UEFI Boot Order:
      USB Diskette on Key/USB Hard Disk
      OS Boot Manager
      Internal CD/DVD ROM Drive
      ! Network Adapter
    With these settings, when I restart my computer what shows up is: "Boot Device Not Found". This is what I get in the Boot Manager:
    Boot Option Menu
    OS Boot Manager
    Boot From EFI File
    (Arrow Up) and (Arrow Down) to change option, ENTER to select an option. Press F10 for BIOS Setup Options, ESC to exit.
    PLEASE HELP... P.S. My laptop has Windows 8.

    Read the article

  • Is SOA really dead?

    - by Ahsan Alam
    I have come across many articles and blogs where authors have strongly hailed the death of Service-Oriented Architecture (SOA). I could almost hear the laughter pouring out of their writings. Being a big supporter of SOA, I have found myself wondering – have I been following the wrong path all along? Do I need to change the way I think? Then I started to look around. Many newer technologies and concepts have evolved in the past few years. People are starting to take advantage of cloud computing, SaaS (Software as a Service), multitudes of on-demand platforms, and many more. Now I started thinking – is SOA really dead? In order to effectively utilize these newer concepts, I believe we need SOA more than ever, because it gives us loose coupling. People often forget that the key principle behind SOA is loose coupling. We cannot achieve SOA just by throwing in services (WCF, web services); we need loosely coupled systems.
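
    As a small, purely illustrative sketch of that loose-coupling point (all names invented here): consumers of the WCF contract below depend only on the interface and the message shapes, so the implementation and storage behind it can change freely. Simply generating a service over an existing domain class would couple callers to its internals and miss the point the post is making.

    using System.Runtime.Serialization;
    using System.ServiceModel;

    // The only things a consumer couples to: the operation and the message shapes.
    [DataContract]
    public class OrderStatusRequest
    {
        [DataMember]
        public string OrderNumber { get; set; }
    }

    [DataContract]
    public class OrderStatusReply
    {
        [DataMember]
        public string Status { get; set; }
    }

    [ServiceContract]
    public interface IOrderStatusService
    {
        [OperationContract]
        OrderStatusReply GetStatus(OrderStatusRequest request);
    }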

    Read the article

  • How do I pause and resume apt-fast package download?

    - by jasoncruz98
    I know that in order to speed up apt-get downloads I can use apt-fast (which uses the aria2c or axel engine, depending on which one I install during configuration). But even though it says it can pause and resume downloads, I don't know how to do it, and I can't find any answer online that tells me how. I have no intention of pausing the apt-fast update function; I just want the ability to pause the sudo apt-fast install package_name function and resume the downloading of a package in Ubuntu at will using apt-fast (with axel or aria2c). I have seen in some forums that sudo apt-fast update cannot be paused because it requires starting the entire process again. Please correct me if I'm wrong. Any help would be much appreciated.

    Read the article
