Search Results

Search found 51287 results on 2052 pages for 'commercial application'.


  • Need to move a Debian server from i686 to x86_64 architecture

    - by user64204
    I have a Debian server that I need to move from one hosting provider to another. I don't really know how the old server was set up; all I know is that it's running a Ruby on Rails application with a lot of custom libraries installed, and that I should prepare myself for a painful migration.

    Old server:
      - OS: Debian 5.0.9
      - used disk space: 3.2 GB
      - architecture: i686

    New server:
      - OS: Debian 5.0.9
      - free disk space: 10 GB
      - architecture: x86_64

    As you can see, the problem is that the servers are running different architectures. Q: Is there any way I could migrate the old server to the new one in a few steps (or am I just dreaming that I could)? I was thinking maybe I could:
      - get the list of packages and gems installed on the old server and use a loop to install them all on the new one (sketched below);
      - copy the disk contents from the old server to the new one while excluding whatever is architecture-specific (the problem is that I don't really know what to exclude).
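    A rough sketch of that inventory step in Python (assuming dpkg and gem are on the old server's PATH; the output file names are arbitrary):

      #!/usr/bin/env python
      # Capture the installed package and gem lists on the old server so they can
      # be replayed on the new one (dpkg --set-selections + apt-get dselect-upgrade
      # for the packages, and a gem install per line for the gems).
      import subprocess

      def dump(cmd, outfile):
          # Run a command and save its output for replay on the new server.
          with open(outfile, "w") as f:
              subprocess.check_call(cmd, stdout=f)

      dump(["dpkg", "--get-selections"], "packages.txt")
      dump(["gem", "list", "--local"], "gems.txt")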


  • Fix Fatal Error Condition showing system path

    - by JMC
    I've noticed there are a large number of servers running Magento Commerce that will return a fatal error exposing the system path:

      Fatal error: Uncaught exception 'Exception' with message 'File '/usr/local/www/magento/data1702/media/css' does not exists.' in /usr/local/www/magento/data1702/lib/Varien/File/Transfer/Adapter/Http.php:96
      Stack trace:
      #0 /usr/local/www/magento/data1702/get.php(205): Varien_File_Transfer_Adapter_Http->send('/usr/local/www/...')
      #1 /usr/local/www/magento/data1702/get.php(165): sendFile('/usr/local/www/...')
      #2 {main}
        thrown in /usr/local/www/magento/data1702/lib/Varien/File/Transfer/Adapter/Http.php on line 96

    Magento as an application is generally good about suppressing error messages. How can a Linux server running Apache be configured to avoid returning this error message, given that the app has trouble suppressing it?


  • Windows 7 x64 Hard Freezing (again)

    - by Lanissum
    A while ago, my computer was randomly freezing a few minutes after booting, and I ended up replacing the CPU and motherboard after testing the RAM and hard drive; I also couldn't find anything wrong with the video card. After replacing the presumably faulty hardware, everything worked fine for about a month and a half. All of a sudden, my computer is randomly freezing a few minutes after loading any intensive application (games, mostly). Most of the time it just freezes on the current frame until I hard reset, although once it printed a BSOD stating that dxgmms1.sys was to blame. The only difference between these two episodes I can think of is that I can now do word processing, internet and other work without issue, as opposed to the near uselessness my computer was reduced to last time. For those of you who want to know, I tested my memory with memtest86 (for 64-bit machines). I can't figure out what could have started this latest round of issues; the event log just states that a kernel-power event has occurred (like last time), but I think that's just a generic "this machine has rebooted after a sudden shutdown" message.


  • Is an I/O benchmark made for hardware an accurate assessment of a Windows VM's performance under vSphere 5?

    - by Jeremy
    We support an enterprise application running on Windows Server 2008 R2. One of our customers has chosen to install on VMware, and what I'm finding is that the VMs are relatively slow compared to physical hardware. Our product development team has advised that many VMs appear to run particularly slowly on I/O benchmarks, which impacts performance in production. I've tried the AttoSoft I/O benchmark and found that for smaller I/O blocks (1-32K) the VM I'm looking at is 25x slower than hardware, and for larger I/O blocks (1-8MB) it's 10x slower. Is this a fair benchmark? If not, any suggestions for a fair test?


  • When can an FTP server close its passive connections?

    - by Don Kirkby
    Does the FTP protocol allow the server to close any of its passive connections while the client is still connected? Can it tell when the client is finished receiving and then close the connection? I'm including an FTP server in my application using the pyftpdlib Python project. I've got it to work in active and passive mode, but I'm a bit concerned about when it closes its passive connections. I've tried connecting to it with both FileZilla and the default ftp command in Ubuntu, and in both cases, I get a new passive port for every request. That is, if I sit in the root folder and type ls 10 times, I use up 10 ports. This means that I have to allocate a big block of passive ports for the FTP server to use so it won't run out. As soon as the client disconnects, the server releases all the passive connections associated with that client and those ports can be reused. However, a long-running connection could use up a lot of ports.
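    For reference, a minimal sketch of how the passive range can be pinned down in pyftpdlib (module layout as in pyftpdlib 1.x; the port range, credentials and home directory are placeholders):

      from pyftpdlib.authorizers import DummyAuthorizer
      from pyftpdlib.handlers import FTPHandler
      from pyftpdlib.servers import FTPServer

      authorizer = DummyAuthorizer()
      authorizer.add_user("user", "password", "/srv/ftp", perm="elradfmw")

      handler = FTPHandler
      handler.authorizer = authorizer
      # Every data transfer (LIST, RETR, STOR, ...) opens a new passive socket,
      # so this block has to cover the number of simultaneous transfers expected,
      # not the number of connected clients.
      handler.passive_ports = range(60000, 60100)

      FTPServer(("0.0.0.0", 21), handler).serve_forever()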


  • Can I display a JPG or PNG on the framebuffer (/dev/fb*)?

    - by ndmweb
    I know I can capture the framebuffer in Linux using something like cp /dev/fb0 ~/myimage and re-display it by copying it back to the device, like so: cp ~/myimage /dev/fb0. What format is the framebuffer image data in, and how would I go about displaying a pre-made image (JPG, PNG) on the framebuffer? Can I convert to this format using ImageMagick? P.S. I'm using a Raspberry Pi running Raspbian.

    Update 11-12-2012: I ended up using pygame to display images in my application (roughly as sketched below). I'm not sure whether it uses the framebuffer to display the images, but it meets my needs quite well.
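    A minimal sketch of that pygame approach (the image path is a placeholder, and the environment hints assume SDL 1.2 with the fbcon driver, as typically shipped with pygame on Raspbian):

      import os
      import pygame

      # Ask SDL to draw on the framebuffer console rather than X.
      os.environ.setdefault("SDL_VIDEODRIVER", "fbcon")
      os.environ.setdefault("SDL_FBDEV", "/dev/fb0")

      pygame.init()
      screen = pygame.display.set_mode((0, 0), pygame.FULLSCREEN)  # current resolution
      image = pygame.image.load("/home/pi/myimage.png")            # JPG or PNG
      screen.blit(image, (0, 0))
      pygame.display.flip()
      pygame.time.wait(5000)  # keep the picture on screen for a few seconds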


  • Setting Rails up on a Linode - Nginx Issue

    - by rctneil
    I am extremely new to this, so please don't shoot me down. I have set up a Linode running Ubuntu, and it is all sort of working except Nginx. I am following this guide: http://rubysource.com/deploying-a-rails-application/ and this one for nginx: http://library.linode.com/web-servers/nginx/installation/ubuntu-10.04-lucid. When I go to my IP, I get a 500 Internal Server Error. I have tried starting nginx and it looks like it starts fine. When I run:

      ps awx | grep nginx

    I get:

      308 ?        Ss     0:00 nginx: master process /usr/sbin/nginx
      2309 ?       S      0:00 nginx: worker process
      2311 ?       S      0:00 nginx: worker process
      2312 ?       S      0:00 nginx: worker process
      2313 ?       S      0:00 nginx: worker process
      2850 pts/0   S+     0:00 grep --color=auto nginx

    I really am not sure what else to do to get it running. Any help? Neil


  • Can I find the session ID for a user logged on to another machine?

    - by Dan
    I want to open an application on another computer on the same network via the command line. The scenario here is that the user is in a room surrounded by about 20 computers and wants to be able to launch the same app on every computer without walking from screen to screen opening it up on each individual machine. I've discovered that I can get the basic functionality for this using PsExec as follows:

      psexec \\[computer] -u [username] -p [password] -d -i [SessionID] [program]

    For computer, username, password, and program, I'm good. Does anyone know of a way I can figure out which SessionID is assigned to a particular user logged on to a particular machine on the network? Alternatively, is there a better way to go about what I'm trying to accomplish?
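    One possible lookup, sketched in Python: ask the remote machine who is logged on with the standard quser command and pull the numeric session ID out of the matching user's row, then feed that to psexec. The computer and user names below are placeholders.

      import subprocess

      def session_id(computer, username):
          # quser prints one row per logged-on user; the session ID is the first
          # purely numeric column in that row.
          out = subprocess.check_output(["quser", "/server:" + computer], text=True)
          for line in out.splitlines()[1:]:        # skip the header row
              if username.lower() in line.lower():
                  for token in line.split():
                      if token.isdigit():
                          return int(token)
          return None

      print(session_id("LAB-PC-07", "student"))
      # then: psexec \\LAB-PC-07 -u ... -p ... -d -i <id> <program>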


  • Can only connect to IIS site through localhost

    - by Rembrandt Q. Einstein
    I'm building a web service for my company's iPhone application, and everything has been working smoothly by running tests through localhost on the development machine. I'm now in the phase where I need to test connections from other computers within the network, and any connection other than localhost gives me a 404. My internal IP, 127.0.0.1, and the computer name all get a 404 when connecting from any computer, whether the one the site is hosted on or any other on the network. Telnet can get through to port 80, and I've temporarily disabled all firewalls on this machine (I do not have control over the external firewall, but I'm only testing connections within the network). Does anyone have a clue why this is happening? I was able to connect to the web service from other computers when it was hosted on a Mac via Apache, but because I'm now using a SQL Server connection, I'm restricted to using IIS for Windows Authentication. Googling only provided answers related to firewalls, and mine is disabled. Note: I cannot use Anonymous Authentication, but even when I tested with it, it did not affect the issue.


  • Host a Debian repository on a Windows Web/Ftp server

    - by Dave
    At the risk of causing a matter vs. antimatter paradox that would end the world as we know it ... Is it possible to host a Debian repository on a Windows server? We have some applications which are available for Windows, Mac OS X, and Linux. Our web site, from where the application can be downloaded, is a Windows Server 2008 box running IIS 7. That is not going to change, and I would like to avoid having to purchase another server and/or domain. I would like to take advantage of the Debian packaging system so that I can just instruct users to add our repository to their software sources, and then they can install, get updates, resolve dependencies (some of which are not yet in the stable/main distributions of my target platforms), etc. The instructions I can find on the internet require linux-specific tools to create a local repository, but are unclear as to whether or not that can be copied to an FTP site as is, or if it requires some local daemons to be running or something.


  • Advanced Terminal / Console apps for Mac OS X?

    - by Jakob Egger
    I use a lot of command-line programs, very often with similar arguments. Can anyone recommend an application or a workflow that lets me store often-used shell commands and search through my recent commands, using a GUI? I have commands that I use very often (e.g. rsyncing a specific directory to a server) and other commands that I use less often. Creating shell scripts for every code snippet I might reuse seems a bit awkward. Especially for programs that I use only seldom, I end up reading the docs over and over again, because I forgot to write down the exact shell command. Ideally I would like an app that's just like Terminal.app, but provides some kind of history and snippet management. What do you use to keep track of shell commands?


  • rsync: server-side limits on bandwidth and connections

    - by c2h2
    In a VoIP application, I have up to 3000 clients that rsync audio files from their Linux server daily. The server is placed in a data center (10 Mbps inbound/outbound) and also works as a VoIP SIP server running FreeSWITCH, so low ping latency must be maintained. I would therefore like server-side control of rsync that can:

      - limit total outbound bandwidth;
      - limit the total number of connections (reject clients once the maximum number of connections is reached and let them retry after a specific time frame);
      - OPTIONAL: list/kill individual connections.

    Normally I would use ssh + rsync + PEM keys with some extra options, but the above requirements are not feasible with simple command lines. Can anyone point me in some direction or show some scripts/tools? I would probably also integrate them and release the result on GitHub. Thanks!


  • Smart backup software

    - by gisek
    I use a laptop on a daily basis. As I have a lot of important data there, it would be nice to back up some directories every day. Can you recommend a specific application that would take care of it? Maybe there is an app that would instantly commit changes I make in a directory on my laptop to the backup folder? The important thing is that I have some big files (a few GB each) that receive minor changes very often; I'm talking about VirtualBox disk images. It would be nice if the software could handle that smartly. Also note that I'd like to store the backup on an external USB HDD, which sometimes isn't plugged in.


  • Logical move of a server to the UK: what do I do with the SSL certificates?

    - by flyfishr64
    I have been asked to move a rails application from the US to the UK. This involves bringing up the rails stack on Ubuntu 8.04.4; that's completed. I'm stumped with the SSL configuration though. The plan was to bring this server up with the same domain name but temporarily use a subdomain (app2.xxx.com instead of app.xxx.com) during the move and for testing, then rename it to app.xxx.com when we're ready for the cutover (does that make sense?). In the meantime, we need a new cert for the app2 subdomain. So to generate a CSR, I need a server key but do I need a new one, or should I copy the one from the existing production server?


  • Need for TCP fine-tuning on a heavily used proxy server

    - by Vijay Gharge
    Hi all, I am using Squid as an Internet proxy server on RHEL 4 update 6 and 8, under quite heavy load, i.e. 8k established connections during peak hours. Without depending too much on the application provider's expertise, I want to get the maximum out of Linux. With respect to that, I have the following questions:

      1. How do I find out whether there is scope for further TCP fine-tuning (without exhausting the available resources), given that the benchmark values provided by the vendor look poor? Is there any parameter available from the OS / network stack that will show me this?
      2. If there is scope, how should I identify and configure the TCP stack parameters, i.e. using sysctl or any specific parameter?
      3. After tuning, how should I clearly measure performance enhancement or degradation?


  • Exchange 2010 550 5.7.1 unable to relay

    - by isorfir
    I have a website application that needs to send email via our Exchange servers. It sends email internally fine, but when sending to an external address I get the 550 5.7.1 unable to relay error. I followed this guide to create a connector to allow relay. Unfortunately, all office email was trying to use that connector and was not being routed correctly. It also appeared as though it opened it up for spammers to use. This is obviously unacceptable and a secure method is needed.


  • Thunderbird 3.0 refuses to start on Mac OS X?

    - by jtimberman
    I just downloaded Thunderbird 3.0 on my MacBook Pro running Leopard and installed it in /Applications. When I attempt to start it, the icon opens on the Dock as normal, but I get the following dialog. I don't have a ~/Library/Application Support/Thunderbird directory at all, let alone a .parentlock file. While I didn't expect it to help, I rebooted my system, signed out and back in, and closed all programs besides Thunderbird. Earlier versions of Thunderbird worked just fine.


  • Best design for a data server serving small pictures (~40 KB)

    - by Nicolas Manzini
    I'm designing the server structure for my application in case things go well. I have one database server connected to multiple servers that process connections, all of them with lots of RAM and fast processors (I'm still looking for a way to use multithreading, because right now it's plain Apache + PHP, so lots of RAM is needed). Upon an answer from those servers, the client can then connect to another server to retrieve pictures, using the address it previously got from the database. Is it a good idea to have one database server, with let's say nginx and an SSD disk, having to send all the pictures to everybody? Or should I have multiple servers accessing a shared SSD drive, or multiple disks updating each other? Also, should I put a lot of RAM in the database server? Because there probably won't be one picture much more popular than another.


  • What are the default/recommended access rights for %ALLUSERSPROFILE%?

    - by RED SOFT ADAIR-StefanWoe
    We have a Windows application that reads and writes some data for all users. We place it at %ALLUSERSPROFILE%\OurProgram*.* We now encounter a few cases in larger companies, where users do not have write permission to %ALLUSERSPROFILE%. Most of these cases are running Windows 7. The problem does not occur on a normal desktop installation of Windows 7 though. What is the recommended policy for this location? I have not found any "official" information about this. Is there a different location where all users have write permission?


  • Enable POST on IIS 7

    - by user26712
    Hello, I have a WCF service that requires the POST verb. This service is hosted in an ASP.NET application on IIS 7. I have successfully confirmed that GET works, but POST does not. I have the following two operations; GET works, POST does not:

      [OperationContract]
      [WebInvoke(UriTemplate = "/TestPost", BodyStyle = WebMessageBodyStyle.Bare,
          RequestFormat = WebMessageFormat.Json, ResponseFormat = WebMessageFormat.Json)]
      public string TestPost()
      {
          return "great";
      }

      [OperationContract]
      [WebGet(UriTemplate = "/TestGet", BodyStyle = WebMessageBodyStyle.Bare,
          RequestFormat = WebMessageFormat.Json, ResponseFormat = WebMessageFormat.Json)]
      public string TestGet()
      {
          return "great";
      }

    When I try to access TestPost, I receive a message that says: "Method not allowed". Can someone help me configure IIS 7 to allow POST requests? Thank you!


  • Tool to launch a script driven by modem activity

    - by Will M
    Can anyone suggest a software tool (preferably for Windows XP or later) that would launch an application or script in response to a phone call being received on a landline phone line connected to a data modem on the same PC? Or, better, in response to a sequence of touch-tones being played over such a phone line. This would allow, for example, using the telephone to manipulate firewall settings so as to create another layer of security for remote internet access to that computer. I seem to recall seeing tools to do this sort of thing in the days before broadband internet access, when there was more attention paid to various tips and tricks for the dial-up modem, but a few attempts at Google haven't turned anything up.
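    Not an off-the-shelf tool, but the simplest form of the idea can be sketched with Python and pyserial: watch the modem's serial port for the unsolicited "RING" messages a Hayes-compatible modem emits on an incoming call, and launch a script when one arrives. The COM port and script path below are placeholders, and touch-tone detection (which needs a voice modem) is not covered.

      import subprocess
      import serial  # pyserial

      ser = serial.Serial("COM3", 9600, timeout=1)
      while True:
          line = ser.readline().decode("ascii", errors="ignore").strip()
          if line == "RING":
              # Incoming call detected; hand off to whatever script should react.
              subprocess.Popen(["cmd", "/c", r"C:\scripts\on_incoming_call.bat"])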


  • Apache Balancing by source IP

    - by Daniel
    I am using Apache's proxy balancer to balance one subdomain (e.g. subdomain.domain.com) to an application which is located on 2 servers. Here is an extract from my Apache configuration file:

      <Proxy *>
          Order deny,allow
          Allow from all
      </Proxy>

      <Proxy balancer://cluster1>
          BalancerMember http://server1:28081 route=w1
          BalancerMember http://server2:28082 route=w2
      </Proxy>

      ProxyPass /path balancer://cluster1/path
      ProxyPassReverse /path balancer://cluster1/path

    My question is whether it's possible to decide, based on the source IP address, which BalancerMember should be used for the request, e.g. requests from 1.2.3.4 go to member 1?


  • How to make sure your server's NIC performance is at its best on Windows?

    - by Bobb
    I realised that I have been following some obscure paper on configuring NICs on Windows for too long. It may be outdated given the hardware released in the past couple of years and Windows Server 2008 R2. I read a bit about offloading and RSS settings on Windows and realised that it is very much circumstantial; no one can really say "enable that and disable this", etc. So what I really want for my next server is to set up a test environment and measure how my particular application behaves with different settings. The target is primarily TCP latency. Please note I am talking about latency inside the box. Are there precision tools for Windows to measure latency (down to microseconds)? P.S. I know this is not an easy question. Windows time drift is an awful problem for any precision test, but I am sure I am not the first person to need this... Please share your experience.
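    Not a dedicated tool, but a sketch of the kind of in-box measurement in question: time a TCP round trip over loopback with Python's time.perf_counter(), which on Windows is backed by QueryPerformanceCounter and resolves well below a microsecond. The port number and sample count are arbitrary.

      import socket
      import threading
      import time

      PORT = 50007  # arbitrary free port on loopback

      def echo_server():
          # Minimal echo service so the client can measure a full round trip.
          srv = socket.socket()
          srv.bind(("127.0.0.1", PORT))
          srv.listen(1)
          conn, _ = srv.accept()
          while True:
              data = conn.recv(64)
              if not data:
                  break
              conn.sendall(data)

      threading.Thread(target=echo_server, daemon=True).start()
      time.sleep(0.2)

      cli = socket.create_connection(("127.0.0.1", PORT))
      cli.setsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY, 1)  # avoid Nagle delays

      samples = []
      for _ in range(1000):
          t0 = time.perf_counter()
          cli.sendall(b"x")
          cli.recv(1)
          samples.append((time.perf_counter() - t0) * 1e6)  # microseconds

      samples.sort()
      print("median %.1f us, p99 %.1f us" % (samples[len(samples) // 2],
                                             samples[int(len(samples) * 0.99)]))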


  • Prevent the "System" process from locking my files in a shared folder.

    - by Kamarey
    I have an application that creates files to be processed by a SQL bulk operation. The files are created in a shared folder on another server and then picked up from there by SQL. The problem is that sometimes SQL returns an error saying that the file is locked by another process and can't be accessed. The process that locks these files is the "System" process. It looks like it locks the files because they are in a shared folder, but I'm not sure. Using software to unlock the files manually is not an option, as the whole bulk process is automatic. The question is: why does the "System" process lock these files, and is there a way to prevent this?


  • Lotus Notes 8.5 quota

    - by Cividan
    We're using Lotus Notes 8.5, and I have a user who was over his quota because he had sent 6 emails with attachments of over 800 MB (no comment...). I deleted these oversized emails and emptied the trash, but Domino keeps sending quota warning emails. I checked the All Documents view and the messages are no longer there, and I emptied the trash again. I saw a post on the internet saying to compact his database. When I go to File > Application > Properties and click the Info tab, I see that he uses 35.7% of the 3 GB database. When I click "Compact", I see a message saying the database compaction is being processed; the message disappears after about a minute, but nothing else seems to happen, and when I look back later the space problem has not changed. Any advice would be appreciated.

