Search Results

Search found 5781 results on 232 pages for 'cheap hosting'.


  • Identify Long Running or Slow PHP Scripts

    - by Kirk
    I have a web server getting around 25K visits a day at yougetsignal.com. Sometimes the site feels a bit sluggish. I am hosting it on nginx with php5-fpm. Is there a way for me to see a list of all of the long-running requests that are coming to the site? I'd love to have a real-time list of all of the active requests that PHP is handling and how long they have been running. Kind of like top, but just for the web server. That would let me know how long requests are taking and which script is the culprit. Anyone have any ideas on how I can do this?
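
    For anyone in the same spot, php5-fpm itself can log slow requests and expose a live status page. A minimal sketch of the pool settings (the pool file path, the 5-second threshold, and the service name are assumptions, not taken from the question):

        # append slow-request logging and a status page to the fpm pool config
        # (path assumed to be the Debian/Ubuntu default; adjust for your layout)
        cat <<'EOF' >> /etc/php5/fpm/pool.d/www.conf
        ; log a backtrace for any request running longer than 5 seconds
        request_slowlog_timeout = 5s
        slowlog = /var/log/php5-fpm.slow.log
        ; expose a live status page (needs a matching location block in nginx)
        pm.status_path = /fpm-status
        EOF
        service php5-fpm restart
        tail -f /var/log/php5-fpm.slow.log   # shows which script and which line are stuck

    The slow log names the exact script and function each time a request crosses the threshold, which is usually enough to find the culprit without a full profiler.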

    Read the article

  • Attaching 3.5" desktop drive to MacBook SATA

    - by Kyle Cronin
    I have a mid-2007 MacBook that, according to the Apple Store, has suffered some liquid damage and requires a new logic board to operate correctly, a ~$750 repair I've been told (it would normally be around ~$300 were it not for the "liquid damage"). The unit itself works fine - the only problem I've been having is that the system does not recognize the battery and will not charge it. Curiously, the system can still be powered by the battery and even recognizes when the power cord is detached by dimming the backlight, but I digress. Now that this laptop will likely become a desktop, I'm wondering if it might be possible to attach a desktop drive. I recently purchased a 2TB SATA drive and I'm wondering if it's possible to somehow attach it where the current internal drive connects. Obviously the drive itself will not fit inside the device, but as the unit will spend the rest of its days on my desk, that's not really much of an issue. My main questions are: Is this possible? If so, how would I connect the drive? Would a SATA extender cable work? Is the SATA port on my MacBook capable of powering a desktop drive? Or should I just get a SATA male-to-female cable and see if I can power the drive through other means (a cheap power supply, for example)? The disk I'm referring to is the Hitachi Deskstar HD32000. Though I couldn't find that exact model on Hitachi's support site, these are the power requirements for a similar drive, the 7K2000 (2TB, 7200RPM, SATA II): power requirement +5 VDC (+/-5%) and +12 VDC (+/-10%); startup current (max) 1.2 A on +5 V and 2.0 A on +12 V; idle 7.5 W. From what I've read, 2.5" drives require 5V, meaning that my MacBook obviously is capable of producing it. The specs seem to suggest that this drive is capable of accepting 5V instead of the typical 12V - is this an accurate interpretation of the power requirements? Or does it need both 12V and 5V?

    Read the article

  • Repositories for CentOS that don't suck?

    - by Keyo
    I'm used to using the Ubuntu/Debian repositories and they are great. I can apt-get just about any package and it'll be there. I have not found this to be the case on CentOS. I called my hosting company and they suggested I install atomic turtle since it's compatible with cPanel. This didn't work when I tried to install git:
        yum install git
        ... No package git available
    The same thing happens for just about any package; the default repositories are pathetic. So perhaps there are other repositories I can use. Can anyone suggest any? Edit: The problem was cPanel excluding some git dependencies in yum.conf. See http://www.cmdln.org/2010/05/07/install-git-on-centos-cpanel-server/
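
    For reference, a rough sketch of the check described in that link (the exclude line shown is only an example of what cPanel typically adds, not a copy of the actual file):

        # see whether yum.conf is excluding the packages git depends on
        grep -i '^exclude' /etc/yum.conf
        # e.g. exclude=perl* ... (git pulls in perl-Git, so the whole install gets blocked)

        # ignore the excludes for this one transaction instead of editing the file
        yum --disableexcludes=main install git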

    Read the article

  • How to allow an internal server to accept remote connections without going through RD Gateway

    - by Matt Ahrens
    So, I help administer a collection of servers running various Windows Server environments. We have an RD Gateway server, properly configured, to gatekeep for us. It does not have the other servers listed in its server farm category, though. I just added a refurbished server for a non-profit development environment that is sharing the rack space and port. I would like this server to be accessible via remote connection, but without requiring RD Gateway authentication (I cannot add the users of this development server to our gateway since they do not work for the organization hosting the rack). Is there any way for me to add this dev server as an exception to which servers require RD Gateway clearance, or otherwise let users bypass RD Gateway credentials for this one machine? Thanks, and let me know if I am misinformed on how RD Gateway works or anything. I am still learning.

    Read the article

  • Postfix and Webmin installed. What's next?

    - by Johnny Craig
    I'm trying to get IMAP running and don't know what the problem is. I'm a developer, not a network guy (our network guy left). We had Postfix installed already for outgoing mail on 8 domains. We only had incoming mail on 1 domain, but that mail server is located on a different IP. Now we want incoming mail on another domain, but we don't want it on another IP; we want it on the same IP as the website itself. I installed Dovecot today because my hosting company said I needed it. It seems to run fine. Do I need Dovecot AND Postfix, or are they the same thing? Dovecot does not show up anywhere in Webmin. What I can't seem to figure out how to do is add a user email account so I can try to telnet in on port 143. I think I have everything installed, just need the next step.... sorry for the newb question.
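
    As a rough sketch of that next step on a plain Postfix + Dovecot box delivering to system users (the username is a placeholder, and a cPanel or virtual-user setup would differ):

        # create a system user whose mailbox Dovecot will serve
        useradd -m testmail
        passwd testmail

        # confirm Dovecot is actually listening for IMAP
        netstat -tlnp | grep :143

        # try a manual IMAP login
        telnet localhost 143
        # then type:  a1 LOGIN testmail yourpassword
        #             a2 LIST "" "*"
        #             a3 LOGOUT

    Postfix and Dovecot are not the same thing: Postfix accepts and delivers the mail, Dovecot is what lets a mail client read the mailbox over IMAP/POP3, so both are normally needed.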

    Read the article

  • Xen HVM guest has severe clock drift

    - by ipartola
    I am seeing a very severe clock drift on my Xen HVM VPS, rented from a hosting provider, so I don't have access to the dom0 system. I continuously run ntpd, but the clock drifts by as much as 30 seconds in 5 minutes and NTP cannot keep up. Has anyone experienced this? Here are some details:
        $ dmesg | grep clock
        [ 0.160000] Measured 347 cycles TSC warp between CPUs, turning off TSC clock.
        [ 0.396000] * this clock source is slow. Consider trying other clock sources
        [ 0.550448] Switching to clocksource acpi_pm
        [ 0.653135] rtc_cmos 00:05: setting system clock to 2011-03-09 02:45:40 UTC (1299638740)
        $ cat /sys/devices/system/clocksource/clocksource0/available_clocksource
        acpi_pm
        $ cat /sys/devices/system/clocksource/clocksource0/current_clocksource
        acpi_pm
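
    A stopgap sketch, not a fix for the underlying TSC/clocksource problem (the service name, cron interval, and NTP pool host are assumptions): stop the losing battle ntpd is fighting and periodically step the clock instead.

        # ntpd slews at most ~0.5 ms/s, which cannot keep up with 30 s of drift per 5 minutes
        /etc/init.d/ntp stop
        # step the clock every 5 minutes via cron (note the user field in /etc/cron.d files)
        echo '*/5 * * * * root /usr/sbin/ntpdate -s pool.ntp.org' > /etc/cron.d/step-clock

    The real fix usually has to come from the provider's dom0 side, since only acpi_pm is exposed as a clocksource here.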

    Read the article

  • cURL looking for CA in the wrong place

    - by andrewtweber
    On Red Hat Linux, in a PHP script I am setting cURL options as such:
        curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, true);
        curl_setopt($ch, CURLOPT_CAINFO, '/home/andrew/share/cacert.pem');
    Yet I am getting this exception when trying to send data (curl error 77):
        error setting certificate verify locations: CAfile: /etc/pki/tls/certs/ca-bundle.crt CApath: none
    Why is it looking for the CAfile at /etc/pki/tls/certs/ca-bundle.crt? I don't know where that path is coming from, as I don't set it anywhere. Shouldn't it be looking in the place I specified, /home/andrew/share/cacert.pem? I don't have write permission to /etc/, so simply copying the file there is not an option. Am I missing some other curl option that I should be using? (This is on shared hosting - is it possible that it's disallowing me from setting a different path for the CAfile?)
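
    A couple of quick sanity checks, assuming shell access to the box; the "apache" user name and the test URL are assumptions, not taken from the question. Error 77 generally means the file you pointed at could not be opened (permissions or open_basedir), in which case the error message falls back to showing curl's build-time default path.

        # can the web-server user actually traverse into your home dir and read the bundle?
        sudo -u apache ls -l /home/andrew/share/cacert.pem   # needs root; skip if shared hosting forbids it

        # does the bundle itself work when curl is told to use it directly?
        curl --cacert /home/andrew/share/cacert.pem https://example.com/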

    Read the article

  • Comprehensive solution for managing patches, event viewing, change management, inventory, etc

    - by Holocryptic
    I'm looking for a solution that incorporates most or all of the following: patch management, server event viewing/tracking, AD change management, ticketing with an internal/external KB, remote access (the ability to shadow user sessions or create new ones), imaging, and inventory. Our environment contains Windows servers and ESXi hosts (we're not completely virtual, but we're moving in that direction), plus various Cisco and Linksys switches and firewalls. This is a tall order, and I don't know if it can be done on a reasonable budget. I've looked and found some questions on SF that deal with some of this: http://serverfault.com/questions/72015/active-directory-management-tools-for-medium-sized-forest-less-than-1000-users http://serverfault.com/questions/4021/are-there-any-tools-to-do-change-management-with-active-directory-group-policy http://serverfault.com/questions/21752/what-is-a-good-patch-update-management-server What I'm ideally looking for is a reasonably cheap solution that integrates these features into a central interface. We're a non-profit, so money is a limiting factor (the cheaper, the better; but we have a max of $15k). What we are trying to avoid is having to deal with multiple vendors, while maintaining scalability (we're creating more sites that we'll have to manage). Is this possible, or will we have to cobble something together to make it work for us?

    Read the article

  • Questions about Domains and DNS

    - by ShoX
    Hi, I am totally new to the DNS and server hosting world and not quite sure what I need. I want to get a domain and point it to my own server, so that the user sees example.com in the URL bar and example.com/foo/bar will work. Depending on the subdomain, it should do different things (a different base directory on the web server, FTP, etc.). My email should also be able to be sent to and received by that server. What confuses me is that in an A record I can only list IP addresses, not ports. So do I have to set up a nameserver on my own server? Or do I accomplish this via vhosts on my webserver? I would appreciate any help or a link to a tutorial. I know how DNS works and know some basic Apache stuff, etc., so no need to explain that. Thanks
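
    On the port question: DNS only maps names to addresses, and the web server listening on port 80 decides what to serve based on the Host header, so per-subdomain behaviour is done with name-based virtual hosts rather than anything in DNS. A minimal Apache sketch, assuming a Debian-style layout (paths, domain, and the "files" subdomain are placeholders):

        # define two name-based vhosts; on Apache 2.2 you may also need "NameVirtualHost *:80"
        cat <<'EOF' > /etc/apache2/sites-available/example.com
        <VirtualHost *:80>
            ServerName example.com
            ServerAlias www.example.com
            DocumentRoot /var/www/example.com/htdocs
        </VirtualHost>
        <VirtualHost *:80>
            ServerName files.example.com
            DocumentRoot /var/www/example.com/files
        </VirtualHost>
        EOF
        a2ensite example.com && service apache2 reload

    In DNS you would then point example.com and the subdomains (A or CNAME records) at the same server IP; mail is handled by an MX record pointing at that host.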

    Read the article

  • SQL 2008 R2 3rd Party Peer-to-Peer Replication, Global Site Distribution

    - by gombala
    We are looking at hosting 3 globally distributed SQL Server installations at different data centers. The intent is that Site A will serve web traffic and data for a specific region, and the same goes for Sites B and C. In the case that the Site A data center goes down, loses connectivity, etc., the users of Site A will fail over to Site B or C (depending on which is up). Also, if a user from Site A travels to Site C, they should be able to access their data as it was on Site A. My question is: what SQL replication technology (SQL replication or 3rd party) can support this scenario? We are using SQL 2008 R2 Enterprise at each site, and each site runs on top of VMware with a NetApp filer. Would something like distributed caching help in this scenario as well? We have looked at and tested peer-to-peer replication but have encountered issues with conflicts during our testing. I imagine there are other global data centers that have encountered and solved this issue.

    Read the article

  • Uninstall SQL Server 2005 Express after Demoting the DC

    - by Walter Aman
    A Windows Server 2003 SP2 machine hosting a now-orphaned installation of SQL 2005 Workgroup was pressed into service as a DC in a disaster recovery scenario. It has since been demoted. The server also hosts legacy apps for which we lack reinstallation resources; thus our desire to preserve it as close to intact as possible while removing the orphaned roles. All efforts to remove SQL 2005 through Control Panel and ARPWrapper /remove fail with error 29528. Should I abandon this and leave the orphaned SQL dormant, or is it reasonable to remove it post-demotion?

    Read the article

  • Index a low-cost NAS on Windows 7

    - by JcMaco
    Has anyone found a way to index the files stored on Network Attached Storage in Windows 7, so that the files can be available in Windows Search and Libraries? I am referring to cheap, widely available NAS units like the Western Digital My Book series that use an embedded Linux server. Similar question: http://windows7forums.com/windows-7-networking/6700-indexing-nas-drive-libraries.html EDIT: Windows Help proposes making the files stored on the NAS available offline. This is obviously not a good solution if the NAS holds more data than the client can store. "If the folder is on a network device that is not part of your homegroup, it can be included as long as the content of the folder is indexed. If the folder is already indexed on the device where it is stored, you should be able to include it directly in the library. If the network folder is not indexed, an easy way to index it is to make the folder available offline. This will create offline versions of the files in the folder, and add these files to the index on your computer. Once you make a folder available offline, you can include it in a library. When you make a network folder available offline, copies of all the files in that folder will be stored on your computer's hard disk. Take this into consideration if the network folder contains a large number of files."

    Read the article

  • Backing up mail accounts without full access to mailserver

    - by Agos
    Hi everybody. I'm in the process of migrating some stuff off a (crappy) host. Files were easy with SSH access, but mail is giving me pause. This is the situation: a qmail server with no SSH access; I own the postmaster account; accounts are accessible via a web interface or POP3. I'm interested in transferring the emails, but if whole accounts can be transferred that would be better. Since it's POP3, I'm fairly confident every message has already been downloaded, but of course I'd like to download the whole thing to be safer. Right now I have this in mind:
        1. Enter the web admin.
        2. Change each account's password (it's only a dozen or so accounts, so still feasible).
        3. Send each user the new password, asking them please not to change it.
        4. Pull everything down with getmail or something like that.
        5. Put it on the new IMAP server in some way (which I still haven't planned).
    But I feel there should be a better way to do this. Is there? Thanks in advance!
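
    For the getmail step, a minimal per-account rc file might look like this; the host name, credentials, and paths are placeholders, not details from the question.

        # one rc file per mailbox, kept in getmail's default directory
        mkdir -p ~/.getmail ~/backup/user1/{cur,new,tmp}
        cat <<'EOF' > ~/.getmail/getmailrc-user1
        [retriever]
        type = SimplePOP3Retriever
        server = pop3.oldhost.example
        username = user1@example.com
        password = newpassword

        [destination]
        type = Maildir
        path = ~/backup/user1/

        [options]
        delete = false
        read_all = true
        EOF
        getmail --rcfile getmailrc-user1

    The resulting Maildir can later be uploaded to the new IMAP server with a tool such as imapsync or by pointing an IMAP client at both accounts and copying folders across.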

    Read the article

  • Windows 2008 VPS always crashes when out of disk space

    - by Pickels
    Hello, I am renting a Windows Server 2008 DC SP2 VPS for hosting my ASP.NET projects. Now, for the second time this month, my VPS has run out of disk space. The first time it was a log file that got too big, and yesterday it was my mistake for uploading a website without noticing the lack of space on the VPS. The side effect is that the VPS corrupts some files when trying to write them: last time it was Plesk that stopped working, yesterday it was IIS. So I was wondering, is this normal behavior? I called my service provider to ask if they could restore a backup and to ask if this is normal, and they assured me it was. I am not trying to blame them, and I know it's mostly my fault for not monitoring my VPS better or for not setting better defaults.

    Read the article

  • SSL certs or intermediate for DMZ

    - by rex
    I've been tasked with deploying and managing load balancers covering internal servers and DMZ servers. I have no experience with this, and this is a first for my organization as well. Balancers are up, running, legit. Currently we are using a self-signed cert for Exchange/OWA. I know that we should have a cert signed by a CA, but the balancer has options for SSL cert or intermediate cert, and I'm unclear on the difference, or on which we need. We will be hosting Lync, Exchange and some custom apps in the DMZ. disclaimer: Apologies up front, I'm desktop support. I recently passed my Net+. It seems that has made me the network engineer in this organization.
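
    For what it's worth, the CA-signed server certificate and the intermediate are normally installed together as a chain; the "intermediate" option on the balancer is where the CA's chain certificate goes, not a substitute for your own cert. A quick way to check what a server is actually presenting, and the usual way chains are bundled (hostnames and file names are placeholders):

        # inspect the chain a server presents to clients
        openssl s_client -connect mail.example.com:443 -showcerts </dev/null

        # many balancers want the server cert and the intermediate concatenated into one PEM
        cat server.crt intermediate.crt > chain.pem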

    Read the article

  • How to analyse logs after the site was hacked

    - by Vasiliy Toporov
    One of our web projects was hacked. The attacker changed some template files in the project and one core file of the web framework (it's one of the well-known PHP frameworks). We found all the modified files with git and reverted them. So now I need to find the weak point. With high probability we can say that it was not FTP or SSH password theft. The hosting provider's support specialist (after analyzing the logs) said it was a security hole in our code. My questions: 1) What tools should I use to review the Apache access and error logs? (Our server distro is Debian.) 2) Can you give tips for spotting suspicious lines in the logs? Maybe tutorials or primers on useful regexps or techniques? 3) How do I separate "normal user behavior" from suspicious behavior in the logs? 4) Is there any way of preventing such attacks in Apache? Thanks for your help.
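
    As a starting point for question 2, a few passes over the access log that often surface injection and traversal attempts; the log path is the Debian default and the patterns and IP are generic examples, not tailored to this site.

        LOG=/var/log/apache2/access.log

        # requests that smell like remote file inclusion or directory traversal
        grep -Ei '(\.\./|=https?://|=ftp://)' "$LOG" | less

        # which URLs receive POSTs, and how often (scripts that never normally take POSTs stand out)
        awk '$6 == "\"POST" {print $7}' "$LOG" | sort | uniq -c | sort -rn | head

        # once a suspicious IP shows up, review everything it did
        grep '^1.2.3.4 ' "$LOG" | less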

    Read the article

  • Creating a backup system with FreeNAS

    - by masfenix
    We are currently in the process of opening a new accounting firm in the new year (actually moving from our previous location). I am looking for a cheap/free solution to back up our files (small text files, a couple of KB each). I was impressed with FreeNAS and Windows Backup, but I found out that Windows Backup only keeps backups for a maximum of 2 years. The work machines will be running Windows 8 or Windows 7. There can be many work machines; however, we have only one to start with (i.e., think of it as just one employee). I have an old Core 2 Duo with 2 GB of RAM that I can convert to a server if need be. I want the syncing to be done over the LAN, since the data is confidential and should never touch the outside world. So ideally, I would like the following scenario: a SkyDrive/Dropbox-like service to sync my client files across the work machines and a central server. The "server" part should store a history of files (I don't know how this will be done since the file will have the same name?). This isn't really necessary, but I can see it becoming useful. I am not familiar with RAID, so does a software RAID solution exist? I will most likely be buying 2 hard drives.
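
    On the "history of files with the same name" point, one common approach on a Linux/FreeNAS-style box is rsync with --link-dest: each run produces a dated snapshot directory, but unchanged files are hard-linked to the previous snapshot, so old versions stay browsable while costing almost no extra space. A rough sketch, with all paths as placeholders:

        SRC=/mnt/share/clients/
        DEST=/mnt/backup
        TODAY=$(date +%F)

        # copy only what changed; everything else is hard-linked to the last snapshot
        rsync -a --delete --link-dest="$DEST/latest" "$SRC" "$DEST/$TODAY"

        # point "latest" at the snapshot we just made
        rm -f "$DEST/latest"
        ln -s "$DEST/$TODAY" "$DEST/latest"

    Run from cron, this gives one browsable folder per day, and deleting old snapshot folders reclaims space without touching newer ones.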

    Read the article

  • BIOS not detecting working SATA hard drive.

    - by Evan
    Some time ago my power supply died. It's a long story from then till now, but the important bit is that I ended up with a new hard drive and a new power supply. I tested whether my original hard drive was still alive, and it booted and worked perfectly until I turned it off; when I started it again it would not boot. I bought new SATA cables, assuming that the one I had was not seating properly (it was cheap and wobbly), but no dice. Upon start-up I am presented with a message telling me to insert boot media into the selected drive or add a drive and restart. Neither the new nor the old drive is detected by the BIOS, my Vista install disk, or my bootable Linux USB drive. When I remove all of the RAM the computer ceases outputting visual information, and upon reinstalling the RAM and starting up again it gives me a "failed overclock" error. So, does anyone have an idea as to what might be going on? I'm completely lost at this point.

    Read the article

  • FTP "PUT" fails from Virtual Machine, but not host PC: 504 Command not implemented for that paramete

    - by BrianH
    I have an FTP script I'm using to automate a file transfer. The transfer works fine on my PC (XP SP2), but when I try to run it on a VM on my PC (also XP SP2), the "put" command returns: 504 Command not implemented for that parameter. FTP script:
        open [ftp site]
        [username]
        [password]
        cd [directory on FTP server]
        binary
        hash
        put ..\[subfolder1]\[Subfolder2]\[subfolder3]\[filename]
        bye
    The FTP site/server is around the world and not under my control. From what I understand of a 504, it means the command should NEVER work, but since the same script DOES work on my PC (hosting the VM), that eliminates syntax, file naming, etc. The put command, when triggered from the VM, actually creates a 0-length file on the target FTP server but doesn't populate it.

    Read the article

  • Remote access not working without connected monitor

    - by winSharp93
    I am trying to configure a Windows Server 2008 as a Home Server for my personal use (mainly for storing documents, hosting source-control, etc.). The "server" consists of an Intel Atom 2700DC board and an Intel SSD. Configuring remote access to the server, I am confronted with a very strange problem: As long as a monitor is connected to my server, remote access works without any problems. However, when no monitor is connected at boot-time, remote access simply won't work (I keep getting errors when trying to connect that the remote server was not found or that remote access is disabled). Windows definitely boots when no monitor is connected as I receive a message asking me whether to enter safe mode when booting after powering the server down by plugging the power cord. When I plug in a monitor after boot, it stays turned off and remote desktop connections still fail. Do you have any ideas about what I could try?

    Read the article

  • PHP include path problem: same code works on Ubuntu's default Apache and PHP conf, but not on CentOS

    - by Neo
    So the same code works on my Ubuntu server, but when I upload it to my dedicated hosting server running CentOS it seems to add an extra prefix of .:/usr/share/pear:/usr/share/php: to the include path. I tried setting include_path to different things but it just doesn't work. The file is in a directory called language in the same folder as the file that is including it, and I'm using:
        include dirname(__FILE__).DIRECTORY_SEPARATOR."language".DIRECTORY_SEPARATOR."storage.inc";
    and
        include dirname(__FILE__)."/language/language.php";
    and
        include "language/language.php";
    and a lot of other combinations, but I can't get it to find the file.
        Fatal error: require_once() [function.require]: Failed opening required '/home/neo/public_html/migration/include/class/core/storage.inc' (include_path='.:/usr/share/pear:/usr/share/php:/home/neo/public_html/migration') in /home/neo/public_html/migration/include/class/core/class_lang.inc on line 153

    Read the article

  • What's required to enable communication between two IP ranges located behind one switch?

    - by Eric3
    Within our co-located networking closet, we have control over two ranges of 254 addresses, e.g. 64.123.45.0/24 and 65.234.56.0/24. The problem is, if a host has only one IP address, or a block of addresses in only one range, it can't contact any of the addresses in the other subnet. All of our hosts use our hosting provider's respective gateway, e.g. 64.123.45.1 or 65.234.56.1 A host on the 64.123.45.0/24 range can contact the 65.234.56.1 gateway and vice-versa Everything in our closet is connected to an HP ProCurve 2810 (a Layer 2-only switch), which connects through a Juniper NetScreen-25 firewall to the outside world What can I do to enable communication between the two ranges? Is there some settings I can change, or do I need better networking equipment?
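
    Since both ranges sit on the same layer-2 segment behind the ProCurve, one option (if the provider's routing policy allows hosts to talk to each other directly) is to tell each host that the other prefix is on-link, so traffic crosses the switch instead of going up to the gateways. A Linux sketch; the interface name is an assumption, and Windows hosts would use an equivalent static route:

        # on a host in 64.123.45.0/24, reach the other range directly over the shared segment
        ip route add 65.234.56.0/24 dev eth0

        # and the mirror-image route on hosts in 65.234.56.0/24
        ip route add 64.123.45.0/24 dev eth0

    The alternative is for the provider to route between the two subnets at their gateways, which is outside the closet's control.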

    Read the article

  • Explaining Git to someone new to revision control

    - by MaxMackie
    I've recently decided to jump into the world of revision control to work on some open source projects I have. I looked around (Subversion, Mercurial, Git, etc.) and found that Git seemed to make the most sense conceptually to me. I've set everything up on my computer (openSUSE) and made an account on Gitorious (let me know if there is a simpler/better hosting provider). I understand Git from a conceptual point of view (work locally, commit to a local repo, others can now check out from you, right?). But where does Gitorious come into play? Do I commit to it as well as committing locally? Conceptual questions aside, I don't quite understand HOW it works when it comes to making a local repository: running git init inside a folder, and that HEAD file. Keep in mind I have never used any form of revision control before, so even the most basic concepts are foreign to me. As I post this, I'm also reading up and trying to figure it out myself.
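
    To make the local/remote split concrete, a typical first round-trip looks something like the sketch below; the remote URL is a placeholder for whatever the Gitorious project page shows. You commit only to the local repository, and "push" is the separate step that publishes those commits to the hosted copy so others can clone from it.

        # work locally
        mkdir myproject && cd myproject
        git init                          # creates .git/; HEAD is just a pointer to the current branch
        echo "hello" > README
        git add README
        git commit -m "first commit"      # recorded only in the local repo

        # publish to the hosted copy so others can clone it
        git remote add origin git@gitorious.org:myproject/myproject.git
        git push origin master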

    Read the article

  • Is reliability reputation of mechanical keyboards overblown?

    - by Rarst
    A while back I finally worked up to buying a mechanical keyboard (~$100 range, "black" switches) and was initially quite content with the purchase. However, just outside the first year (read: as soon as the warranty expired) it started to develop key-repeat issues (press once, get a chain of the letter repeated) on multiple keys. It doesn't respond to generic cleaning (up to and including compressed air), and searching the Internet shows a noticeable number of people with similar-to-identical issues, spanning years. This makes me severely hesitant to buy another mechanical keyboard, considering:
        - every other keyboard I ever owned, including ultra-cheap crap, managed to last longer than that
        - the typing experience is nice, but not lifechanging-fan-forever nice for me
        - my choice of mechanical keyboards is severely limited: not many brands are represented in the local market, and primarily crazy-looking gamer models
        - needing a Russian layout (not to mention Russian and Ukrainian if possible) excludes international ordering
        - the price tag for the meek year of use I got out of it is plain demoralizing
    It is obvious mechanical keyboards have their fans, but shopping around for the "best fit" or getting into multiple-hundreds price tags is probably not something I am highly interested in. Considering my constraints and bad experience with reliability, is it practical for me to sink more money into buying mechanical keyboard(s) again? In other words: manufacturers are beaming about how crazy reliable mechanical keyboards are. Are active long-time users of such keyboards confidently of the same opinion?

    Read the article

  • Outlook signature distribution tools?

    - by HannesFostie
    Hi, we are soon changing our corporate identity, and as such we will need to change our Outlook signatures. However, being some 125 people, my fellow sysadmin and I don't want to go around changing these manually, and are thus looking for a good way to do this fully automated. Most of our desktops are XP, with an exceptional few running Win7. Most run Outlook 2007, some run 2003. Our environment is AD-centered, and most of the information will come from AD (telephone number, title, ...). The biggest problem I can see so far is that, because we are bilingual (Dutch and French), there will be 2 versions of the signature, depending on the person's main language. People currently do not have anything in AD to distinguish this, but we could create a group for it, or perhaps add some sort of attribute. A cheap if not free tool would be great. eMailSignature could probably do most, if not all, of this for us, but it's a rather expensive tool costing some 1250 euro. We just want to distribute the signatures; actual "management" is less important as job titles don't change all that much. Any tips are welcome!

    Read the article
