Search Results

Search found 10622 results on 425 pages for 'shared hosting'.

Page 207/425

  • Community Forum at Openworld - Presentations available

    - by Javier Puerta
    Thanks to all of you who participated in the Exadata & Manageability Partner Community session that we ran during Oracle OpenWorld in San Francisco. On October 1st we held a new session of the Exadata & Manageability Partner Community, and very special thanks go to the partner speakers who shared their experiences with the rest of the community: Francisco Bermúdez (Capgemini Spain), Dmitry Krasilov (Nvision, Russia) and Miguel Alves (WeDo Technologies, Portugal). The slide decks used in the presentations are now available for download at the Exadata Partner Community Collaborative Workspace (for community members only; if you get an error message, please register for the Community first). In a few weeks we will announce the location of the next Community event, in the spring timeframe.

    Read the article

  • IP fail-over address. Do I need it?

    - by Jon
    I received an email from my web hosting provider, where I have two dedicated servers, saying that from now on I have to pay for my IP fail-over addresses. The server we have hosts a tool used internally by our company. Traffic to it is quite low; no more than three people will use it at the same time. If something happens, we can wait a day to have the tool up and running again. Is it worth having these fail-over addresses? Thanks

    Read the article

  • How can I upgrade from Ubuntu 9.10 to 11.10?

    - by Chinnu
    We need to program in CUDA 5.0, which can be installed only on Ubuntu 11.10 or 12.04. Our current version, 9.10, is no longer supported, so we chose to proceed with a clean installation. Since we have a shared workstation, we used Clonezilla to clone the system; however, booting from the LiveCD showed an unexpected error. We also tried to install 11.10 on an external HDD by partitioning it, but GParted could not be installed and terminated with the error "installArchives() failed", which we couldn't solve even after modifying sources.list. Is there a way to proceed with this upgrade?

    Read the article

  • Software Architecture

    - by Roger
    I have a question about software architecture; can anyone help me or give me some hints? Currently I have a J2EE project deployed on a server. I also need a standard Java (J2SE) project that runs 24 hours x 7 days to monitor something, and it cannot run separately because it shares some of the same classes (such as the Java bean classes) with the J2EE project. Maybe my design is not correct; can anyone suggest what I should do? Would using SOA be the right approach? My current solution is to run the Java project from a bash script, but I don't think that is the best idea. My packages are: com.company.alteck, com.company.altronics, com.company.gamming, com.company.jaycar, com.company.jup, com.company.rpg, com.company.sansai, com.company.wiretech, com.company.yatsal, com.ebay.api, com.ebay.bean, com.ebay.credential, com.ozsstock.finals, com.ozstock.adapter, com.ozstock.aspectj, com.ozstock.model, com.ozstock.persistence, com.ozstock.service, com.ozstock.suppliers. All the packages containing "company" should run separately, but they depend on the model bean classes. Can anyone give me some hints on how to redesign this?

    Read the article

  • How do I implement URL rewriting in my .htaccess file?

    - by Alan
    I'd like to do some URL rewriting (Why? See this question.) so that instead of seeing addresses like labouseur.com/course-compilers.html, users can see and use simply labouseur.com/course-compilers. (Even better, maybe I should restructure that so it's courses/compilers.) I'm using a Linux-based shared hosting service for my website, so I do not have administrative control of the server, but I do have control over .htaccess. The references I've read online seemed less than clear to me, so I'm looking for a little clarity and advice here. Thanks!
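
    A minimal sketch of the kind of rule set that typically goes in .htaccess for this, assuming the host has mod_rewrite enabled and allows .htaccess overrides (the public_html path and the character class in the rule are illustrative, not taken from the site):

```bash
# Create an .htaccess that maps extensionless URLs onto the existing .html files.
# Assumes mod_rewrite is available and AllowOverride permits it; worth confirming
# with the hosting provider before relying on it.
cat > ~/public_html/.htaccess <<'EOF'
RewriteEngine On
# Leave requests for real files and directories alone.
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
# Internally rewrite /course-compilers to /course-compilers.html; the visible URL stays clean.
RewriteRule ^([A-Za-z0-9-]+)$ $1.html [L]
EOF
```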

    Read the article

  • Upgrade from Ubuntu 9.10 to 11.10

    - by Chinnu
    Our project is to develop CUDA programs. Our workstation has CUDA 3.1 installed on Ubuntu 9.10. We need to program in CUDA 5.0, which can be installed only on Ubuntu 11.10 or 12.04. We tried upgrading but ran into many problems because 9.10 is no longer supported, so we chose to proceed with a clean installation. Since we have a shared workstation, we need to back up the settings, and we decided to use Clonezilla to clone the system, but booting from the LiveCD showed an unexpected error. Another option was to install 11.10 on an external HDD by partitioning it, but GParted could not be installed and terminated with the error "installArchives() failed", which we couldn't solve even after modifying sources.list. We are stuck either way, have no idea how to proceed, and have a deadline to submit our CUDA program. Any suggestion is welcome.
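
    One hedged workaround for the package-installation side of this: 9.10 (Karmic) repositories were moved to the end-of-life archive, so apt fails until sources.list points there. The hostnames below are the usual ones; check the actual entries in sources.list before editing.

```bash
# Back up sources.list, then repoint the Karmic entries at the end-of-life archive.
sudo cp /etc/apt/sources.list /etc/apt/sources.list.backup
sudo sed -i -r \
  -e 's|http://(..\.)?archive\.ubuntu\.com|http://old-releases.ubuntu.com|g' \
  -e 's|http://security\.ubuntu\.com|http://old-releases.ubuntu.com|g' \
  /etc/apt/sources.list
sudo apt-get update

# With apt working again, gparted (or update-manager-core for a staged upgrade) should install.
sudo apt-get install gparted
```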

    Read the article

  • Backup dedicated server running Ubuntu 10.04 and Plesk 11.01 prior to updating the OS to Ubuntu 12.04

    - by timmob
    I would like to back up my dedicated server, which is my web server hosting various sites and email, so that I can update the OS to Ubuntu 12.04 and basically restore back to 10.04 if things go wrong. I have a local machine that I can install 12.04 onto, and then I was going to rsync between the two, but I am fairly clueless when it comes to Linux. I can ssh into the remote server and gain root access. Can anyone explain whether I need to back up the whole server hard drive or just some of the files? Thanks, Timmo.
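
    A minimal sketch of a pull-style copy run from the local 12.04 machine, assuming root ssh access works; the hostname, destination directory and database password are placeholders. Plesk also ships its own backup manager, which is worth using in addition for the Plesk-specific configuration, and a plain file copy will not capture running databases cleanly, hence the separate dump.

```bash
# Pull a copy of the remote filesystem over ssh, skipping pseudo-filesystems.
rsync -aAXvz --numeric-ids \
  --exclude=/proc --exclude=/sys --exclude=/dev --exclude=/run --exclude=/tmp \
  root@your-server.example.com:/ /backups/webserver-10.04/

# Dump the databases rather than copying their raw files while MySQL is running.
# DB_PASSWORD is a placeholder for the real credentials.
ssh root@your-server.example.com "mysqldump --all-databases -uroot -p'DB_PASSWORD'" \
  > /backups/webserver-10.04/all-databases.sql
```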

    Read the article

  • Magento not responding to payment gateway notifications fast enough or at all?

    - by robgt
    Some of our customers are getting to the confirmation-of-payment step when purchasing from our Magento store and then getting a timeout error: the SagePay payment gateway tries to contact our server to tell it that a payment was successful (or not…), but it cannot contact our server, or cannot get a response from our server in a timely manner, and the payment/order is then cancelled. I've raised this question with my hosting company, but all they told me was: "This is down to the way the software is configured on your server." This is currently a standard Magento 1.4.0.1 installation as far as payment gateways are concerned. What on earth could this statement mean? Is there some configuration I need to do to make Magento listen to these requests and respond properly?
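
    A quick way to separate connectivity problems from Magento configuration is to request the notification URL from a machine outside the hosting network and time the response; the URL below is a placeholder for whatever callback address the SagePay module is actually configured with.

```bash
# Time a request to the payment-notification endpoint from outside the hosting network.
# Replace the URL with the real callback configured in the SagePay module settings.
curl -sk -o /dev/null -w "HTTP %{http_code} in %{time_total}s\n" \
  "https://www.example-store.com/sagepay/notification-placeholder"
```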

    Read the article

  • Next Quarterly Customer Update Webcast is Nov 27th (Nov 28th in Asia Pacific)

    - by John Klinke
    Join the WebCenter team as we present the latest product direction that was recently shared at the Oracle OpenWorld conference in San Francisco last month.   This Oracle WebCenter Quarterly Customer Update Webcast is scheduled on Nov 27th (Nov 28th in Asia Pacific). We will also be sharing the latest product updates and key support announcements that all WebCenter professionals and solution owners need to know. Don’t miss out on getting the latest information.  There will be two live sessions with Q&A at the end of each session.   Register for Session 1 -  Nov 27th at 9am San Francisco, 12pm New York, and 5pm London Register for Session 2 – Nov 28th at 9am Singapore, 11am Sydney, and 6pm (Nov 27th) San Francisco

    Read the article

  • Hands-on GlassFish FREE Course covering Deployment, Class Loading, Clustering, etc.

    - by arungupta
    René van Wijk, an Oracle ACE Director and a prolific blogger at middlewaremagic.com, has shared the contents of a FREE hands-on course on GlassFish. The course provides an introduction to GlassFish internals, JVM tuning, Deployment, Class Loading, Security, Resource Configuration, and Clustering. The self-paced, hands-on instructions guide you through installing, configuring, deploying, tuning and other aspects of application development and deployment on GlassFish. The complete course material is available here. This course can also be taken as a paid instructor-led course; attendees will get their own VM and will have plenty of time for Q&A and discussions. Register for this paid course. Oracle Education also offers a similar paid course on Oracle GlassFish Server 3.1: Administration and Deployment.

    Read the article

  • What do backup procedures and troubleshooting guidelines mean for a system?

    - by Podolski
    I am writing the documentation for a piece of software which I have made, but I don't understand some aspects of it. It asks me to write about backup procedures, but what exactly does this mean? Does it mean backing up the database to another hosting service, or something else entirely? I am equally dumbfounded by what troubleshooting guidelines are. If you have any idea what these terms could mean, feel free to give your insight even if you aren't 100% sure, in case it sparks the right idea. Thanks.
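
    For what it is worth, "backup procedures" in this kind of documentation usually just means writing down what gets backed up, how, how often, and how to restore it, while "troubleshooting guidelines" lists known failure symptoms and the steps to diagnose them. A hedged illustration of the sort of commands such a section might document (database name, paths and dates are made up):

```bash
# Nightly backup a "backup procedures" section might describe: dump the application
# database and keep dated, compressed copies.
mysqldump --defaults-extra-file=/etc/backup/db.cnf appdb | gzip > /var/backups/appdb-$(date +%F).sql.gz

# The matching restore step a "troubleshooting guidelines" section might reference.
gunzip -c /var/backups/appdb-2012-11-27.sql.gz | mysql appdb
```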

    Read the article

  • Data Source Security Part 4

    - by Steve Felts
    So far, I have covered the Client Identity and Oracle Proxy Session features, with WLS or database credentials. This article will cover one more feature, Identity-based pooling. Then there is one more topic to cover: how these options play with transactions.

    Identity-based Connection Pooling

    An identity-based pool creates a heterogeneous pool of connections. This allows applications to use a JDBC connection with a specific DBMS credential by pooling physical connections with different DBMS credentials. The DBMS credential is based on either the WebLogic user mapped to a database user or the database user directly, depending on the "use database credentials" setting described earlier. Using this feature with "use database credentials" enabled is essentially what is proposed in the JDBC standard: a heterogeneous pool with users specified by getConnection(user, password).

    The allocation of connections is more complex when the Enable Identity Based Connection Pooling attribute is enabled on the data source. When an application requests a database connection, the WebLogic Server instance selects an existing physical connection or creates a new physical connection with the requested DBMS identity. Heterogeneous connections are created as follows:

    1. At connection pool initialization, the physical JDBC connections based on the configured or default "initial capacity" are created with the configured default DBMS credential of the data source.
    2. An application tries to get a connection from a data source.
    3a. If "use database credentials" is not enabled, the user specified in getConnection is mapped to a DBMS credential, as described earlier. If the credential map doesn't have a matching user, the default DBMS credential from the data source descriptor is used.
    3b. If "use database credentials" is enabled, the user and password specified in getConnection are used directly.
    4. The connection pool is searched for a connection with a matching DBMS credential.
    5. If a match is found, the connection is reserved and returned to the application.
    6. If no match is found, a connection is created or reused based on the maximum capacity of the pool:
    - If the maximum capacity has not been reached, a new connection is created with the DBMS credential, reserved, and returned to the application.
    - If the pool has reached maximum capacity, a physical connection is selected from the pool based on the least recently used (LRU) algorithm and destroyed. A new connection is then created with the DBMS credential, reserved, and returned to the application.

    It should be clear that finding a matching connection is more expensive than in a homogeneous pool, and destroying a connection in order to create a new one is very expensive. If you can use a normal homogeneous pool or one of the lightweight options (client identity or an Oracle proxy connection), those should be used instead of identity-based pooling.

    Regardless of how physical connections are created, each physical connection in the pool has its own DBMS credential information maintained by the pool. Once a physical connection is reserved by the pool, it does not change its DBMS credential even if the current thread changes its WebLogic user credential and continues to use the same connection.

    To configure this feature, select Enable Identity Based Connection Pooling. See "Enable identity-based connection pooling for a JDBC data source" in the Oracle WebLogic Server Administration Console Help: http://docs.oracle.com/cd/E24329_01/apirefs.1211/e24401/taskhelp/jdbc/jdbc_datasources/EnableIdentityBasedConnectionPooling.html

    You must make the following changes to use Logging Last Resource (LLR) transaction optimization with identity-based pooling, to get around the problem that multiple users will be accessing the associated transaction table:
    - Configure a custom schema for LLR using a fully qualified LLR table name. All LLR connections will then use the named schema rather than the default schema when accessing the LLR transaction table.
    - Use database-specific administration tools to grant permission to access the named LLR table to all users that could access it via a global transaction. By default, the LLR table is created during boot by the user configured for the connection in the data source. In most cases, the database will only allow access to this user and not to mapped users.

    Connections within Transactions

    Now that we have covered the behavior of all of these options, it's time to discuss the exception to all of the rules. When you get a connection within a transaction, it is associated with the transaction context on a particular WLS instance.

    When getting a connection with a data source configured with non-XA LLR or 1PC (using the JTS driver) with global transactions, the first connection obtained within the transaction is returned on subsequent connection requests regardless of the values of username/password specified and independent of the associated proxy user session, if any. The connection must be shared among all users of the connection when using LLR or 1PC.

    For XA data sources, the first connection obtained within the global transaction is returned on subsequent connection requests within the application server, regardless of the values of username/password specified and independent of the associated proxy user session, if any. The connection must be shared among all users of the connection within a global transaction within the application server/JVM.

    Read the article

  • How do I install a driver for a Kodak esp 3250 printer?

    - by user108608
    First, my system: Pentium 4 (don't remember the speed), 1 GB RAM, dual boot to separate physical drives, Fdos and Lubuntu 12.1. Second, my LAN has four computers using the same printer: an Intel quad-core i5 with 4 GB RAM running Windows 7 64-bit (the Kodak ESP 3250 is connected to and shared from here), a Gateway 17" laptop running Windows 7 32-bit, an Asus tablet (small laptop) running Lubuntu 12.1, and my dual-boot system running Fdos and Lubuntu 12.1. The problem: I downloaded c2esp_25c-1_i386.deb and tried to install it using the GDebi package installer; it loads the files, looks for the CUPS driver and ends with the error "Dependency is not satisfiable: libcupsdriver1 (=1.4.0)". What do I do now? Is there some place I can get the correct CUPS driver? Further information: the Asus tablet was running Ubuntu 12.1 (very slowly and with a few crashes) and could print to the LAN printer with no problems. Is there something in Ubuntu that can be loaded into Lubuntu?
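
    The error usually means the downloaded .deb was built against an older CUPS packaging that still shipped libcupsdriver1 as a separate library, which newer releases no longer provide under that name. A hedged first step is to see whether the distribution's own repositories carry the Kodak driver instead of forcing the third-party .deb; the package names below are guesses to confirm with the search.

```bash
# See what the current release calls the Kodak ESP driver and its CUPS dependencies.
apt-cache search c2esp
apt-cache search cups | grep -i driver

# If the repository carries it, install from there rather than the downloaded .deb.
# The name has varied between "c2esp" and "printer-driver-c2esp" across releases.
sudo apt-get install printer-driver-c2esp || sudo apt-get install c2esp
```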

    Read the article

  • Game Asset Storage: Archive vs Individual files

    - by David Colson
    I am in the process of creating a 3D C++ game, and I was wondering what would be more beneficial for asset storage. I have seen some games use a single compressed asset file with everything in it, and others use lots of little compressed files. If I had lots of individual files, I would not need to load one large file at once and use up memory, but the code would have to seek around for the correct files when a level loads. There is no file seeking needed when dealing with one large file, but then what about all the assets that are not currently needed but get loaded along with that one file? I could also have an asset file for each level, but then how do I deal with shared assets? This has been bothering me for a while, so tell me: what other advantages and disadvantages are there to either way of doing things?

    Read the article

  • Unable to connect with IIS7 Manager to remote site

    - by saifkhan
    I was unable to connect to a remote site with IIS7 Manager. I got on the phone with the hosting provider and they started troubleshooting. After a few minutes they had gone over all my settings (username, password, the whole shebang) and I still couldn't connect. I then asked the support tech if any ports needed to be opened on my side, and she said "ONLY PORT 80 NEEDS TO BE OPENED". After a few more minutes I hopped over to the IIS7 website but still couldn't find anything indicating specific ports, though I did come across a doc mentioning 8172 as a port IIS7 uses. So I went to my firewall and opened port 8172 outbound. That did the trick, and the support tech updated her document accordingly.

    Read the article

  • Should a singleton be available for the program's lifetime, or should it be destroyable?

    - by Manoj R
    Should a singleton be designed so that it can be created and destroyed at any time during the program, or should it be created so that it is available for the lifetime of the program? Which is best practice? What are the advantages and disadvantages of both? EDIT: As per the link shared by Mat, the singleton should be static. But then what are the disadvantages of making it destroyable? One advantage is that memory can be saved when it is not in use.

    Read the article

  • GitHub OS project: how to have a stable version and a work-in-progress version

    - by Para
    I have started my own open-source application and I am hosting it on GitHub. My problem is that I push changes to the repository from more than one location, and I can't always finish something in time, but I would still like to push it anyway so I can fetch it later from my other location. I'd like to be able to somehow have a stable version while the master branch is a work in progress. How do I do this? Is there some button I can push that will take the code from my master branch, make it into a zip file in my downloads tab and call it a version, or should I do this by hand? Would it be better to have the master branch be nice and neat and have a separate branch to play with, then merge the two when the time is right? Would this not cause more problems in the merging phase?
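
    One common arrangement, sketched below with placeholder branch and tag names: keep master as the stable line, push day-to-day work to a development branch from any location, and cut an annotated tag whenever master reaches a releasable state (GitHub exposes tags as downloadable zip and tarball archives automatically, so nothing needs to be uploaded by hand).

```bash
# Day-to-day: push unfinished work on a development branch so it can be fetched elsewhere.
git checkout -b develop
git push -u origin develop

# When the work is ready, fold it into the stable branch and tag a release.
git checkout master
git merge --no-ff develop
git tag -a v1.0 -m "First stable release"
git push origin master --tags
```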

    Read the article

  • Forked a project, where do my version numbers start?

    - by TheLQ
    I have forked a project and have changed lots of it. This fork isn't just a small feature change here and a buried bug fix there; it's a pretty substantial change, and only most of the core code is shared. I forked the project at v2.5.0, and for a while I've been versioning my fork at v3.0. However, I'm not sure this is the right way, mainly because when the original project hits v3.0, things get confusing. But I don't want to start over at v1.0 or v0.1 because that implies infancy, instability, and a lack of refinement. This isn't true, as most of the core code is very refined and stable. I'm really lost on what to do, so I ask here: what's the standard way to deal with this kind of situation? Do most forks start over again, bump up version numbers, or do something else that I'm not aware of?

    Read the article

  • Azure website that talks to third party services

    - by Andy Frank
    I have a website that crawls data from many third-party services when a user browses to a webpage. This can be really slow because I hit the third-party servers and process the returned data before showing it to the user. I am hosting the website on Azure (shared mode) and am thinking about improving my implementation. Here is what I am thinking: run a service that crawls data from the third-party services, processes it and then stores it in a database; when a user browses to my site, the site pulls the data from the database and displays it. But the above solution is not clear to me. Should I have a normal service or a WCF service? If a WCF service, should the website talk to the database or to the WCF service (which can access the data from the database)? If a normal service, how can I deploy it on Azure?

    Read the article

  • Hulu desktop stopped working on my Dell

    - by jwdinkc
    After last week's Flash update, Hulu Desktop no longer works on my Dell laptop, though it still works on my HP desktop. Here's what the CLI tells me on the Dell:
    Inspiron-1564:~$ huludesktop
    Failed to open VDPAU backend libvdpau_nvidia.so: cannot open shared object file: No such file or directory
    I tried sudo apt-get install libvdpau_nvidia.so but got "E: Unable to locate package libvdpau_nvidia.so" and "E: Couldn't find any package by regex 'libvdpau_nvidia.so'". Hulu does work through the browser and through XBMC, but XBMC just doesn't seem to match the video quality of Hulu Desktop. I don't really know why an nvidia .so is needed for the Intel graphics in a Dell Inspiron 1564 anyway. So, do you guys have a solution?
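
    libvdpau_nvidia.so is the filename of a driver library, not an apt package name, which is why apt-get cannot locate it. A hedged line of attack on Intel graphics is to make sure the generic VDPAU library is installed and then see which backends are actually present; the package name below is the usual one, but confirm it with the search first.

```bash
# List the VDPAU-related packages this release knows about, then install the generic
# VDPAU wrapper library; the NVIDIA backend itself only matters on NVIDIA GPUs.
apt-cache search vdpau
sudo apt-get install libvdpau1

# Check which VDPAU driver files ended up installed.
ls /usr/lib/vdpau/ 2>/dev/null || ls /usr/lib/*/vdpau/ 2>/dev/null
```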

    Read the article

  • How to build completely modular web applications

    - by Webnet
    In the coming months we're going to begin a project where we take a system we've built for a client (v1) and rebuild it from scratch. Our goal with v2 is to make it modular, so that this specific client will have their own set of modules, while another client may use a different set of modules altogether. The trick here is that Company A might have a series of checkout and user modules that change how that system works, while Company B might stick with the standard checkout procedure but customize how products are browsed. What are some good approaches to application architecture when you're building an application from scratch and want a core that's shared among all clients, while still maintaining the flexibility for anything to be modified specifically for a client? I've seen CodeIgniter's hooks and don't think that's a good solution, as we could end up with 250 hooks and it still wouldn't be flexible enough. What are some other solutions? Ideally we won't need to draw a line in the sand.

    Read the article

  • JavaOne 2012 Content Catalog is Available

    - by arungupta
    The JavaOne 2012 Content Catalog is now available! The complete list of technical sessions, birds-of-a-feather sessions, hands-on labs, tutorials and other details is available. We are still working on the overall schedule, and it will be shared in the coming days. The conference will be held in San Francisco from September 30th to October 4th, 2012. You can also connect using the usual social media channels: Facebook, Twitter, blogs, LinkedIn, and mix. Oracle OpenWorld, running in parallel to JavaOne, also has its content catalog available.

    Read the article

  • Moving dozens of existing standalone retail sites to one central inventory database: what should I know going in?

    - by palintropos
    This will be the first project of this scale that I have attempted, and the first time I have run a website at all (much less dozens) using an off-site database. In particular, I'd like to know: what sorts of optimizations I should read up on to make this run as smoothly as possible; any pitfalls or gotchas that wiser, more experienced folk are aware of and that I should be on the lookout for; and what damage-control and preventative measures I should take against the nightmare scenario of the main server (hosting the database) having an outage and grinding over 100 websites to a halt (because they would have no access to the product data).

    Read the article

  • DIY Wirelessly Charged LED Lanterns

    - by Jason Fitzpatrick
    Earlier this year we shared a clever project that turned LEDs, batteries, and PVC into mini, waterproof, and virtually indestructible lanterns. This remake of the project makes the units rechargeable. Our favorite part about this project–the upgrade to an older project and the introduction of wireless charging aside–is the fact that the maker behind it is 15 years old. It’s great to see younger people taking an interest in tinkering! Wirelessly Charged Indestructible Lantern [via Hacked Gadgets]

    Read the article

  • July 7th - Java 7th launch

    - by alexismp
    Java 7 is around the corner, and Oracle is hosting a multi-city launch event on July 7th called "Java 7, Moving Java Forward". The event will be held simultaneously at Oracle HQ (Redwood Shores, CA), in São Paulo, Brazil, and in London, UK to celebrate the almost-ready version 7 of Java, the first new release in five years! In addition to the live event and the ability to attend in person if you're in one of those cities, many Java User Groups are planning Java 7 meetings on the same day or soon after, so check your favorite JUG's upcoming meetings; chances are there's a Java 7 event nearby. Tori has all the details for this launch event over on the OTN blog. Register directly here.

    Read the article
