Search Results

Search found 25493 results on 1020 pages for 'custom wordpress pages'.


  • Is there a simple way to back up and restore all Microsoft SQL Server database objects related to a particular application?

    - by Nathan Hartley
    I would like to back up not only the databases that belong to a particular application living on a shared server, but also the things that get stored outside of the database: the server accounts, jobs, maintenance plans and whatever else I can't think of at the moment. This backup should be complete enough that its corresponding restore will recreate the entire application on a different SQL Server instance. This seems like a problem others must have dealt with in the past, so before I embark on creating custom PowerShell scripts for each application, I have come to ask you... Can you help?
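
    A minimal sketch of the moving parts, with server, database and path names as placeholders: the user databases back up with plain T-SQL, while logins and SQL Agent jobs live in master/msdb and have to be exported separately, so a database restore alone will not carry them.

        # Hedged sketch -- SHAREDSRV, AppDb and the backup path are placeholders.
        # 1. Back up each database belonging to the application:
        sqlcmd -S SHAREDSRV -Q "BACKUP DATABASE [AppDb] TO DISK='D:\backups\AppDb.bak' WITH INIT"
        # 2. Logins and Agent jobs are stored in master/msdb, not in AppDb; script
        #    them out separately (e.g. Microsoft's published sp_help_revlogin script
        #    for logins, or SMO scripting for jobs and maintenance plans).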

    Read the article

  • Internet Connectivity Issue

    - by MastaChief11
    Lately, I have been having issues connecting to the internet on one of my computers; the computer I am posting from is on the same network and works fine. The issues seemed to start randomly about two days ago, and the only thing that fixed them was connecting to the Hotspot Shield VPN. I do not get any yellow warning sign on the Wi-Fi icon in the taskbar. I am sometimes able to use Google search, but I can never actually reach a website. I am also not able to reinstall Hotspot Shield or update Flash, because the installers have to connect to their companies' servers. I have tried other VPN services just to see if it would fix anything, but as I expected, nothing changed. I am unsure of how I can fix the issue, and I appreciate all help given. I am running Windows 7 Pro 64-bit on a custom-built computer. Thanks.
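
    When a VPN client restores connectivity but direct browsing fails, the usual suspects are DNS or a winsock catalog corrupted by VPN filter drivers. A hedged triage sketch from an elevated command prompt (ordinary diagnostic commands, not a guaranteed fix):

        :: raw connectivity without DNS:
        ping 8.8.8.8
        :: does the configured DNS server resolve names?
        nslookup google.com
        :: does a known-good public DNS server?
        nslookup google.com 8.8.8.8
        :: clear the local resolver cache:
        ipconfig /flushdns
        :: rebuild the winsock catalog (requires admin; reboot afterwards):
        netsh winsock reset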

    Read the article

  • Reverse proxy 502 bad gateway

    - by Brian Graham
    I have set up a subdomain to proxy my Plesk panel, but when saving pages I get a 502 Bad Gateway error instead of a completion message. I am running CentOS 6. Here is my vhost.conf configuration for http://plesk.domain.tld/:

        RewriteEngine On
        RewriteCond %{SERVER_PORT} ^80$
        RewriteRule $ https://plesk.domain.tld/ [R,L]

    Here is my vhost_ssl.conf configuration for https://plesk.domain.tld/:

        SSLProxyEngine On
        <Location />
            ProxyPass https://localhost:8443/
            ProxyPassReverse https://localhost:8443/
        </Location>

    I have more than enough RAM, CPU and HDD (and I have checked); there are no spikes. As well, the posted information does save, it just errors instead of showing me the "This information has been saved." green/red block. Here is the relevant error from /var/log/nginx/error.log (IP/host filtered):

        2014/05/29 02:42:41 [error] 8046#0: *402 upstream prematurely closed connection while reading response header from upstream, client: 173.238.XX.XX, server: plesk.domain.tld, request: "POST /smb/web/edit HTTP/1.1", upstream: "https://198.100.XX.XX:7081/smb/web/edit", host: "plesk.domain.tld", referrer: "https://plesk.domain.tld/smb/web/edit"
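
    nginx's "upstream prematurely closed connection" means whatever it was proxying to dropped the connection before returning response headers, which on slow POSTs often comes down to proxy timeouts or keepalive. A hedged tweak to try in vhost_ssl.conf (these are standard mod_proxy options; whether they cure this particular 502 is untested):

        SSLProxyEngine On
        ProxyTimeout 300
        <Location />
            ProxyPass https://localhost:8443/ keepalive=On timeout=300
            ProxyPassReverse https://localhost:8443/
        </Location>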

    Read the article

  • Dsquery nested groups

    - by Doctor Trout
    Hi there, How would I write a dsquery to get a list of all the members of a d-list, expanding any nested groups to get the members of those groups? I've written this:

        dsquery * -filter "(&(memberOf=cn=...))" -r -limit 0 -attr CUSTOMFIELD sAMAccountName displayName > export.txt

    but it returns nested d-lists and I want to expand these. I then tried this:

        dsquery group -samid "NAME" | dsget group -members -expand > export.txt

    But this just lists the distinguished name of each member, and I want the account name and a custom field returned. Is there any way either of choosing which fields to return from dsget, or of expanding dsquery to show nested group membership? Thanks.
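
    A hedged one-liner along those lines: dsget reads the expanded DNs from the pipe, though it can only return its predefined fields, so the custom attribute would still need a second dsquery * pass over each expanded DN.

        :: expand nested membership, then resolve each member to account + display name
        :: (dsget cannot return CUSTOMFIELD; only dsquery * -attr can do that)
        dsquery group -samid "NAME" | dsget group -members -expand | dsget user -samid -display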

    Read the article

  • What PSU is usually used in mini-ITX cases/chassis?

    - by Subaru Tashiro
    The mini-ITX computer will be a general-use computer, not a dedicated HTPC or home server. In general-use mini-ITX cases, what PSU form factor is usually used? I understand that some case manufacturers provide custom-built PSUs to fit their cases, but I would prefer a case that takes a PSU in a standard form factor, in case a replacement is needed. For example, what PSU fits into general-purpose cases by Lian Li? Am I right to assume that smaller PSU form factors also limit the possible maximum output?

    Read the article

  • when to use squid on the server side?

    - by ajsie
    So I have set up Apache serving my PHP pages. I read about Squid but don't understand why/how I should use it to speed up my web server. From what I've learned, Squid sits in the same network (or another) and caches content requested by the web browsers; when another web browser wants the same page, Squid returns that page cached locally, so it never sends a request to the Apache server (faster response time for the client, and reduced load for the server). So it seems that Squid is for the client side (web browser) and has nothing to do with the server side (Apache). But then some people tell others how they have sped up Apache using Squid, so I'm confused. Could Squid be used on the server side too? And how would it work?
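
    It can: Squid also runs as a reverse proxy ("accelerator") in front of the origin server, caching responses so Apache only sees requests for pages that aren't cached yet. A minimal squid.conf sketch, assuming Apache is moved to port 8080 on the same box and using a placeholder site name:

        # Squid listens on 80 as an accelerator in front of Apache
        http_port 80 accel defaultsite=www.example.com
        cache_peer 127.0.0.1 parent 8080 0 no-query originserver name=apache
        acl our_site dstdomain www.example.com
        http_access allow our_site
        cache_peer_access apache allow our_site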

    Read the article

  • External routing for local interfaces in a virtualized network

    - by Arkaitz Jimenez
    Current setup:

        br0
         |-- tun10 -pipe- tun0 (192.240.240.1)
         |-- tun11 -pipe- tun1 (192.240.240.2)
         |-- tun12 -pipe- tun2 (192.240.240.3)

    The pipe program is a custom program that forwards data back-to-back between two tun interfaces. The idea is putting two programs on .2 and .3 while keeping .1 as the local interface on the current machine. The main problem is that I want to route packets to .2 and .3 through .1 and br0, but as they are local interfaces, the kernel ignores any routing instruction and just delivers the packet to the proper interface. I tried iptables, but the nat table doesn't even see ping packets to those ifaces. A "ping 192.240.240.2" delivers an ICMP packet with source and destination .2 to tun1; ideally it should deliver one with source .1 and destination .2 at tun1, through tun0-br0-tun1. Any hint?
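
    The kernel short-circuits traffic to any address in its local table, so plain routes and iptables never see it. One hedged way around this (interface and namespace names are assumptions) is to move the far-end tun devices into their own network namespaces, so .2 and .3 stop being local addresses on the host:

        # give each far endpoint its own network stack so the host no longer
        # considers 192.240.240.2 a local address
        ip netns add peer2
        ip link set tun1 netns peer2
        ip netns exec peer2 ip addr add 192.240.240.2/24 dev tun1
        ip netns exec peer2 ip link set tun1 up
        # a ping to 192.240.240.2 from the host now goes through the routing
        # table (and hence tun0/br0) instead of being looped back internally

    The custom pipe program would then have to open tun1 from inside the namespace (e.g. launched via ip netns exec).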

    Read the article

  • Use apt-get source on a debian repo without using /etc/apt/source.list

    - by Erwan Queffélec
    I'm trying to use apt-get source as a regular user on a Debian Squeeze system. I want to retrieve the sources for cyrus-imapd-2.4 from the testing/wheezy repository. apt-get source works without root privileges; however, it seems there is no way to get apt-get to fetch anything from a repository that is not in /etc/apt/sources.list. Is there any command-line option, alternate sources.list file, or environment variable that will get apt to work with a custom repository? I do have root access, so I could change /etc/apt/sources.list, but I really do not want to do that, for a number of reasons.
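
    A hedged sketch using apt's Dir::* overrides so everything lives under $HOME (paths are examples; depending on the apt version you may also need to override Dir::Cache):

        # fetch from a private sources.list without touching /etc/apt
        echo 'deb-src http://ftp.debian.org/debian wheezy main' > ~/wheezy.list
        mkdir -p ~/apt-lists/partial
        apt-get -o Dir::Etc::sourcelist=$HOME/wheezy.list \
                -o Dir::Etc::sourceparts=/dev/null \
                -o Dir::State::lists=$HOME/apt-lists update
        apt-get -o Dir::Etc::sourcelist=$HOME/wheezy.list \
                -o Dir::Etc::sourceparts=/dev/null \
                -o Dir::State::lists=$HOME/apt-lists source cyrus-imapd-2.4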

    Read the article

  • How would I put together a site requiring several TB? [closed]

    - by acidzombie24
    Let's say I have a site with unmetered 100Mbps bandwidth (I assume that's bits?) and the RAM I require. Most plans I see offer HDDs that hold 250GB or 1TB. But what happens if I compile/generate enough data that I require 10TB or 25TB? (I'd likely have two servers, but...) I wouldn't be serving all of that data (well, not to the public), so a CDN wouldn't make sense. What do I do in this scenario? Do I need to get a custom plan from a hosting provider? (If so, how do I find them?) Are there services that allow me to mount remote drives (that sounds wrong unless it's a CDN, so maybe not)? Are there hosts that deal specifically with unmetered bandwidth and provide lots of disk space? Math says ~1TB is the most I'll ever need, but if I happen to need more I'd like to know my options.

    Read the article

  • Hardware upgrade: Windows 7 bluescreens, Vista loads

    - by Daniel Schaffer
    I just did a fairly significant hardware upgrade while keeping my hard disks. The old system was a Dell OptiPlex 745 with an Intel Core 2 Duo, LGA 775. The new system is custom built around an Intel i5 750. I know you're supposed to do a clean install with a hardware upgrade like this, but I'd had success in the past doing stealth hardware upgrades like this, so I figured I'd give it a shot. Windows 7 Ultimate 64-bit gets through the loading screen, then immediately blue-screens and reboots. Windows Vista Home Premium 32-bit, which I have on an old hard drive from an AMD box (!!), loads up fine. I ran the Windows memory checker just to be sure, and my memory is fine. So, is the BSOD the result of some sort of protection mechanism specific to Windows 7? Is there any hope of salvaging that install?
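
    An early BSOD after a board swap is very often STOP 0x7B: the installed Windows lacks a boot-critical storage driver for the new controller. One commonly cited, hedged remedy is to enable the generic AHCI/IDE drivers in the registry (from the old board before swapping, or by editing the hive offline); the keys below are the standard Windows 7 locations, but whether this rescues a particular install is not guaranteed:

        :: force the generic AHCI and IDE storage drivers to load at boot
        reg add HKLM\SYSTEM\CurrentControlSet\services\msahci /v Start /t REG_DWORD /d 0 /f
        reg add HKLM\SYSTEM\CurrentControlSet\services\pciide /v Start /t REG_DWORD /d 0 /f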

    Read the article

  • Accessing the Internet via browser

    - by ucas
    I am on Windows 7 and using the Firefox browser. I am on WiFi, but since this morning I cannot access the Internet via any browser (Firefox, Chrome, or IE). The laptop shows there is an Internet connection and Skype is online, but I can't reach the Internet. Then I launched the Tor application, which creates a secure channel and provides its own Firefox browser, and I can access the Internet through that browser. So, what might be the problem causing this malfunction? The error:

        The connection has timed out. The server at mail.google.com is taking too long to respond. The site could be temporarily unavailable or too busy. Try again in a few moments. If you are unable to load any pages, check your computer's network connection. If your computer or network is protected by a firewall or proxy, make sure that Firefox is permitted to access the Web.

    Best regards

    Read the article

  • Command-line HTTP crawler for Windows?

    - by Pekka
    Would somebody have a recommendation for a web site crawler that can be invoked and configured from the command line? This would need to run in a Windows environment. Saving the data, following stylesheet links etc. is not an issue. I only need the crawler to start with a page, parse it, and follow all the links on the same domain so that in the end, all pages on the site have been requested once. Background: I'm setting up a web site that gets frequent uploads from an office location. Combining data from various sources, it has several levels of caching. I don't want the first user to visit the site after a fresh upload to have to wait until the page has been generated and saved in the cache.
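
    wget fits this well: in spider mode it requests every page without saving anything, which is exactly what a cache-warming pass needs. A hedged one-liner using the Windows build of wget (the URL is a placeholder; -e robots=off is only appropriate when warming your own site):

        wget --spider --recursive --no-parent --level=inf -e robots=off http://www.example.com/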

    Read the article

  • Are there video cameras which geotag individual frames?

    - by Grzegorz Adam Hankiewicz
    I'm looking for a way to record live video with the specific requirement of having each frame georeferenced with GPS. Right now I'm using a normal video camera plus a PDA+GPS that records the position, but it's difficult to sync the two, and sometimes I've forgotten to turn on the PDA+GPS or it has failed for some reason, and all my video has been useless. Using Google I found that about two years ago a company named Seero produced such video cameras and software, but apparently the domain doesn't exist any more and I only find references on other pages mentioning it. Does somebody know of any other product? I need to record this video in HD and have some way to export the frame positions to Google Maps or other GIS software, so that I can click on the map and see what was being recorded in the video at that point. One GPS position per second is precise enough; intermediate frames of the video stream can be interpolated.

    Read the article

  • What is the best OS for a server hosting a simple Ruby on Rails based pastebin

    - by Koning Baard XIV
    I have created a simple pastebin in Ruby on Rails and Python. I want to host it on an intranet and it will have about 1000 users. I want to use one Apache server with a cluster of Mongrel servers. The server itself is a 2 GHz Intel Centrino with 2 GB RAM. What do you think is the best OS to host this? I thought about Damn Small Linux or a custom LFS system; Ubuntu servers come with loads of stuff I don't need. Maybe there are better OSes? It must be capable of running Apache, Ruby, Python, Mongrel with Ruby on Rails, and SSH. Can anyone recommend me one? Thanks. PS: I am not going to run Windows Server or Mac OS X Server (Macs are expensive).

    Read the article

  • How can I create a persistent SSH connection to "stream" commands over a period of time?

    - by Darth
    Say that I have an application running on one PC that is sending commands via SSH to another PC on the network (both machines running Linux). For example, every time something happens on #1, I want to run a task on #2. In this setup, I have to create an SSH connection for every single command. Is there any simple way to do this with basic Unix tools, without programming a custom client/server application? Basically all I want is to establish a connection over SSH and then send one command after another.
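
    OpenSSH's connection multiplexing does exactly this: the first ssh sets up a master connection and later invocations reuse it, so each command is just a new channel on the existing session. A minimal sketch for ~/.ssh/config, assuming OpenSSH 5.6+ for ControlPersist and a placeholder host:

        # ~/.ssh/config
        Host worker
            HostName 192.168.1.20          # placeholder address of PC #2
            ControlMaster auto             # first connection becomes the master
            ControlPath ~/.ssh/cm-%r@%h:%p # socket later invocations attach to
            ControlPersist 10m             # keep the master alive 10 min after last use

    After that, every "ssh worker some-command" fired by the application on #1 skips the TCP and authentication handshakes and runs almost immediately.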

    Read the article

  • clam anti-virus is slowing down my server performance

    - by Scarface
    Hey guys, I just installed ClamAV (http://sourceforge.net/projects/php-clamav/) for scanning file uploads on my Linux VPS running PHP. The problem is that, for some reason, just enabling the extension in php.ini slows down my entire network. Regular requests such as changing pages, which should take less than 1 second, take 5. Has anyone ever experienced this before, or have a good virus-scanning alternative for scanning file uploads?

        extension=clamav.so
        [clamav]
        clamav.dbpath="/usr/share/clamav"
        clamav.keeptmp=20
        clamav.maxreclevel=16
        clamav.maxfiles=10000
        clamav.maxfilesize=26214400
        clamav.maxscansize=104857600
        clamav.keeptmp=0
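
    One plausible cause is that loading the extension makes every PHP process pull the full ClamAV signature database into memory. A hedged alternative is to keep the signatures resident in the clamd daemon and hand uploads to it, so the PHP processes stay light (service and file names vary by distro and are placeholders here):

        # start the daemon once; it keeps the signature database loaded
        service clamd start
        # scan an individual upload by handing clamd the open file descriptor
        clamdscan --fdpass /tmp/php-upload-XXXXXX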

    Read the article

  • Website crawler/spider to get site map

    - by ack__
    I need to retrieve a whole website map, in a format like:

        http://example.org/
        http://example.org/product/
        http://example.org/service/
        http://example.org/about/
        http://example.org/product/viewproduct/

    I need it to be link-based (no file or directory brute-force): parse the homepage, retrieve all links, explore them, retrieve their links, and so on. I also need the ability to detect whether a page is a "template", so as not to retrieve all of its "child pages". For example, if the following links are found:

        http://example.org/product/viewproduct?id=1
        http://example.org/product/viewproduct?id=2
        http://example.org/product/viewproduct?id=3

    I need to get http://example.org/product/viewproduct only once. I've looked into HTTrack and wget (with its spider option), but nothing conclusive so far. The software/tool should be downloadable, and I prefer it to run on Linux. It can be written in any language. Thanks
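
    A hedged shell sketch of the wget approach: spider the site, log what was visited, then strip query strings and deduplicate so template pages like viewproduct?id=N collapse to a single entry (domain and file names are placeholders):

        # crawl by links only, saving nothing, and keep the log
        wget --spider --recursive --no-parent http://example.org/ -o spider.log
        # pull the visited URLs out of the log, drop query strings, deduplicate
        grep -oE 'http://example\.org[^" ]*' spider.log | sed 's/[?].*//' | sort -u > sitemap.txt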

    Read the article

  • What are the differences between the Fujitsu ScanSnap S300 and the S1300?

    - by Techboy
    Please can you tell me what the technical differences are between the Fujitsu ScanSnap S300 and S1300 scanners?

        S300 = http://www.fujitsu.com/emea/products/scanners/scansnap/tmpl_scanners_scansnap-S300.html
        S1300 = http://www.fujitsu.com/emea/products/scanners/scansnap/tmpl_scanners_scansnap-S1300.html

    The only difference I can see is that the S300 has a CCD image sensor and the S1300 has a CIS image sensor. Other than that, the resolutions, pages per minute and physical sizes of both models are exactly the same. Given that, is there a reason why I would want the S1300 instead of the cheaper S300? I just want a scanner for duplex document scanning, for home use (documents, receipts, etc.).

    Read the article

  • reaching 99.9999% uptime

    - by user35204
    I am currently developing a project which is mission-critical. The actual domain name is registered with 1&1, and I plan on purchasing the DynDNS Custom DNS service (which has 5 different geographical locations for DNS) and then a secondary DNS service, to make my DNS as failover-safe as possible. Does it matter that the registration is with 1&1? Are they a weak link in the chain? All I really use them for is to declare DynDNS as my primary DNS nameserver and my secondary DNS service as the other nameserver. I can transfer the registration to DynDNS; I'm just not sure if it really matters or not. Thanks
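
    The registrar mostly matters for how reliably the delegation itself is published at the parent zone. A hedged sanity check once both DNS providers are configured (the nameserver name below is a placeholder):

        # which nameservers does the parent zone actually delegate to?
        dig +short NS domain.tld
        # does each listed server answer authoritatively, without recursion?
        dig @ns1.example-dns.net domain.tld A +norecurse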

    Read the article

  • Apache 2.4 and PHP 5.4 getting connection reset errors in the browser

    - by zuallauz
    Over the weekend I upgraded my development web server to Apache 2.4 and PHP 5.4. My web application, which was previously working great on Apache 2.2 and PHP 5.3, now gets these "the connection was reset" messages in Firefox (see screenshot). I am connecting to the Linux machine via the local LAN. I'm assuming it might be something to do with the new version of Apache or PHP, or the new LAMP stack which I downloaded from BitNami. It seems to happen every 5-10 requests, and a POST request from a page is more likely to trigger it. Is it timing out the script or something? These are just basic dynamic pages I'm loading, and they worked perfectly on Apache 2.2 and PHP 5.3. Here are my httpd.conf and PHP.ini if that has any clues. Any ideas? Any help much appreciated.
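
    Intermittent resets like this usually leave a trace in Apache's error log, often a segfaulting child process, and the MPM matters because most PHP builds are only safe under prefork. A hedged first look (paths are assumptions based on the usual BitNami layout):

        # which MPM did the new Apache 2.4 build come with?
        /opt/bitnami/apache2/bin/apachectl -V | grep -i mpm
        # watch for segfaults / child exits while reproducing the reset
        tail -f /opt/bitnami/apache2/logs/error_log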

    Read the article

  • Mac OS X Snow Leopard: permissions changed on /var results in dns lookup issues

    - by Ivan
    I was attempting to solve an issue ("/var/log/msmtp.log: permission denied" error when attempting to send mail using msmtp) when I did this:

        chmod -R 770 /var

    After that, my machine would not resolve domain names via cURL (ping also fails). But, oddly, I can enter domain names into Safari and visit any web pages without a problem... I'm actually not sure the chmod command is the cause of the problem, but I suspect it is. Also, if I ls -l on /var (or /private/var) it doesn't seem that any of the subdirectories or files there actually changed permission, but there are many, so I can't say that conclusively... Incidentally, I fixed the original error (msmtp.log permission denied) by setting TMPDIR=/tmp in my local environment (bash). Now that error goes away, but I get this one:

        msmtp: cannot locate host domainname.org: nodename nor servname provided, or not known

    Any ideas about how to go about getting DNS working again?
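
    chmod -R 770 strips world access, and the Unix resolver path used by cURL and ping needs world-readable/searchable directories under /var (Safari resolves through a different path, which fits the symptoms). A hedged recovery sketch; Snow Leopard still shipped permissions repair, and the explicit modes below are the stock values to the best of my knowledge:

        # restore receipt-recorded permissions where possible
        sudo diskutil repairPermissions /
        # put back world-searchable bits on the directories the resolver walks
        sudo chmod 755 /var /var/run
        sudo chmod 1777 /var/tmp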

    Read the article

  • What is the harm in giving developers read access to application server application event logs?

    - by Jim Anderson
    I am a developer working on an ASP.NET application. The application writes logging messages to the Windows event log, a custom application log just for this application. However, I do not have any access to the testing or staging web/application servers. I thought an admin could just give me read access to this event log to help in debugging problems (currently a service that works in dev is not working in the test environment, and I have no idea why), but that is against my client's policy (I'm a consultant). I feel silly repeatedly asking an admin to look at the event log for me. What is the harm in giving developers read access to application server application event logs? Is there a different method of application logging that sysadmins prefer programmers use? Surely admins don't want to be fetching logging messages for developers all the time.
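
    For scale, this is all that read access would amount to: pulling recent entries from the custom log without touching anything else. A hedged example (the log and server names are placeholders):

        :: dump the 50 newest events from the custom log on a remote server, as text
        wevtutil qe MyAppLog /r:stagingserver /c:50 /f:text /rd:true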

    Read the article

  • How can I add leading zeros between delimiters in Excel 2010

    - by Gregory Biernacki
    I am trying to convert a list of property id numbers that has a standard format of 0000-A-00000-00-00, where my worksheet has various combinations like:

        1-A-123
        10-B-1234

    Ideally they would read as follows:

        0001-A-00123-0000-00
        0010-B-01234-0000-00

    I've tried using custom number formatting, but it doesn't like the letter in the middle of the number. I don't know if my only option is to break them apart and then put them back together again. I would accept a solution that merely put the leading zeros at the front of the number (max is 4 characters), so the result could look like 0001-A-123
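
    Custom number formats indeed cannot pad around a letter, but a worksheet formula can split on the hyphens and pad each numeric piece with TEXT(). A hedged sketch assuming the raw id sits in A1 and always has three segments (the trailing -0000-00 is appended to match the examples above; the line breaks are only for readability and it can be entered on one line):

        =TEXT(LEFT(A1,FIND("-",A1)-1),"0000")
         & MID(A1,FIND("-",A1),FIND("-",A1,FIND("-",A1)+1)-FIND("-",A1)+1)
         & TEXT(MID(A1,FIND("-",A1,FIND("-",A1)+1)+1,99),"00000")
         & "-0000-00"

    For "1-A-123" this yields 0001-A-00123-0000-00: the first TEXT pads the leading segment to four digits, the MID keeps the "-A-" middle verbatim, and the second TEXT pads the last segment to five digits.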

    Read the article

  • List of registered domain names

    - by Eric Chang
    Where can I find a list of domain names that I can download as a text file? I looked around and found many sites with lists of expired domain names, or domain names that will expire soon, but that's not what I need; I need a list of CURRENTLY REGISTERED domain names. The closest I found was this site (hxxp://www.list-of-domains.org/), but what I'm looking for, ideally, is just a plain list of registered domains, not web links split across many ad-filled pages with no text file. I know it's possible; after all, how do these sites get the lists of domains that they use? Where can I find such a list?

    Read the article

  • QuickBooks accounting software

    - by Randall Davis
    Using a square 17" lcd monitor, and setting the Monitor to display text in larger font than default, some of the pages in QB have the bottom 2 or 3 lines chopped off. QB support has no ideas for a different/larger monitor that would solve the problem for those of us with 50 something eyes. Seems like everyone is going to the widescreen format, but when we tried one on the QB stuff, it spreads out the page to the point that it is deformed...too wide. Does anybody make a new square screen monitor in 19, 20, or bigger that would solve this issue. we can't be the only ones experiencing this.

    Read the article
