Search Results

Search found 34016 results on 1361 pages for 'static content'.

  • mysqldump trigger crashed tables

    - by m4rc
    We had a database crash this morning, starting at 1 minute past midnight (when the database backup runs). The exception emails I was getting said "Table './core/content' is marked as crashed and should be repaired". My question is basically: can mysqldump cause tables to crash, and if so, how and why? And are there any tools which can detect a crashed table and run a repair on it? Thanks in advance.
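
    For the second question, a minimal sketch of one way to detect and repair crashed tables automatically (assumptions: the tables use MyISAM, which is the engine that produces this "marked as crashed" error, and mysqlcheck is available; credentials are placeholders). Running it from cron shortly before the midnight dump would catch a crashed table before the backup touches it:

        # check every table and repair any that are marked as crashed
        mysqlcheck -u root -p --all-databases --check --auto-repair
        # or repair just the table named in the error
        mysqlcheck -u root -p --repair core content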

  • IIS7 doesn't monitor changes across symlinks

    - by Matt Hensley
    I've used the mklink utility to create a symlink to a directory of web content. IIS7 doesn't "see" changes to any classic ASP files in this linked directory without issuing an iisreset. I've disabled caching, and file changes are picked up for other static files (such as .html), but .asp files are ignored.

  • UNIX tool to dump a selection of HTML?

    - by jldugger
    I'm looking to monitor changes on websites, and my current approach is being defeated by a rotating top banner. Is there a UNIX tool that takes a selection parameter (id attribute or XPath), reads HTML from stdin and prints to stdout the subtree matching the selection? For example, given an HTML document, I want to filter out everything but the subtree of the element with id="content". Basically, I'm looking for the simplest HTML/XML equivalent to grep.
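
    Two candidates, assuming either libxml2's xmllint or the html-xml-utils package is installed (the URL is a placeholder); both read the document from stdin and print only the selected subtree:

        # XPath selection with xmllint ("-" reads stdin; --html tolerates real-world markup)
        curl -s http://example.com/page | xmllint --html --xpath '//*[@id="content"]' -
        # CSS-style selection with html-xml-utils
        curl -s http://example.com/page | hxnormalize -x | hxselect '#content'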

  • Nginx fastcgi problems with django (double slashes in url?)

    - by wizard
    I'm deploying my first Django app. I'm familiar with nginx and fastcgi from deploying php-fpm, but I can't get Django to recognize the URLs, and I'm at a loss on how to debug this further. I'd welcome solutions to this problem and tips on debugging fastcgi problems. Currently I get a 404 page regardless of the URL, and for some reason a double slash. For http://www.site.com/admin/ I get:

        Page not found (404)
        Request Method: GET
        Request URL: http://www.site.com/admin//

    My urls.py patterns, from the debug output (they work in the dev server):

        Using the URLconf defined in ahrlty.urls, Django tried these URL patterns, in this order:
        ^listings/
        ^admin/
        ^accounts/login/$
        ^accounts/logout/$

    My nginx config:

        server {
            listen 80;
            server_name beta.ahrlty.com;
            access_log /home/ahrlty/ahrlty/logs/access.log;
            error_log /home/ahrlty/ahrlty/logs/error.log;
            location /static/ {
                alias /home/ahrlty/ahrlty/ahrlty/static/;
                break;
            }
            location /media/ {
                alias /usr/lib/python2.6/dist-packages/django/contrib/admin/media/;
                break;
            }
            location / {
                include /etc/nginx/fastcgi_params;
                fastcgi_pass 127.0.0.1:8001;
                break;
            }
        }

    And my fastcgi_params:

        fastcgi_param QUERY_STRING $query_string;
        fastcgi_param REQUEST_METHOD $request_method;
        fastcgi_param CONTENT_TYPE $content_type;
        fastcgi_param CONTENT_LENGTH $content_length;
        fastcgi_param SCRIPT_NAME $fastcgi_script_name;
        fastcgi_param REQUEST_URI $request_uri;
        fastcgi_param DOCUMENT_URI $document_uri;
        fastcgi_param DOCUMENT_ROOT $document_root;
        fastcgi_param SERVER_PROTOCOL $server_protocol;
        fastcgi_param GATEWAY_INTERFACE CGI/1.1;
        fastcgi_param SERVER_SOFTWARE nginx/$nginx_version;
        fastcgi_param REMOTE_ADDR $remote_addr;
        fastcgi_param REMOTE_PORT $remote_port;
        fastcgi_param SERVER_ADDR $server_addr;
        fastcgi_param SERVER_PORT $server_port;
        fastcgi_param SERVER_NAME $server_name;
        fastcgi_param PATH_INFO $fastcgi_script_name;
        # PHP only, required if PHP was built with --enable-force-cgi-redirect
        fastcgi_param REDIRECT_STATUS 200;

    Lastly, I'm running fastcgi from the command line with Django's manage.py:

        python manage.py runfcgi method=threaded host=127.0.0.1 port=8080 pidfile=mysite.pid minspare=4 maxspare=30 daemonize=false

    I'm having a hard time debugging this one. Does anything jump out at anybody? Notes: nginx version nginx/0.7.62, Django svn trunk rev 13013.
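
    Two observations from the configuration above, offered as hedged suggestions rather than a confirmed diagnosis. First, the config hands requests to 127.0.0.1:8001 while the runfcgi command listens on port 8080, so one of the two numbers needs to change. Second, the doubled /admin// is consistent with Django's FastCGI handler seeing both SCRIPT_NAME and PATH_INFO set to the request path; a minimal sketch of the usual workaround is to blank SCRIPT_NAME (the equivalent Django-side setting is FORCE_SCRIPT_NAME = '' in settings.py):

        # inside "location / { ... }", after including fastcgi_params
        fastcgi_param PATH_INFO $fastcgi_script_name;  # full request path for the URLconf
        fastcgi_param SCRIPT_NAME "";                  # stop the prefix being applied twice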

  • Does hosting multiple sites on one server hurt your SEO?

    - by MarathonStudios
    I have a handful of (content-unrelated) sites with decent PRs and I'm considering hosting them all on the same server. I've heard that if you do this, internal linking between two separate domains on that server may be seen as less "valid" by Google in PageRank terms (since you obviously own both of the sites, as they share an IP address). Does anyone have any experience with this? I'd love to save some hosting cash by consolidating, but not at the expense of losing the ability to link my sites together powerfully.

  • In-Page RSS Reader (Flash? Javascript?)

    - by Jonathan Sampson
    Has anybody ever seen any (no-installs-necessary) solutions for listing any RSS feed on any page of a website? Ideally it would consist of HTML (JavaScript if necessary) and require no downloads or installs. I am thinking of Twitter-style apps that you load up in an iframe or via JavaScript and that in turn show your latest tweets on the page - same concept, different content. I'm just looking for a shiny gadget; I'm not able to write my own solution for this particular project.

  • Amazon EC2: assign a domain name

    - by user41999
    1. Amazon AWS doesn't provide a DNS service?
    2. I can only assign a static IP through EC2, so the only way to assign a domain name is to use a third-party DNS service? Which do you all recommend? I need one that is able to add SRV records.

  • Burn 30GB zip file to DVD

    - by Joel Coehoorn
    I have a 30GB zip file containing an archive of digital materials available in the school library that I want to burn to DVD. Of course, 30GB is far too large for a single DVD, and the content is already zipped. I'm open to ideas, but leaning towards suggestions that will help me automatically spread the file over multiple DVDs, including a simple program to stitch it back together again later.
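
    One low-tech approach, sketched under the assumption that GNU coreutils (or Cygwin on Windows) is available and that single-layer discs are used; the archive name is a placeholder. 7-Zip can do the same thing natively with its volume option (e.g. -v4g), which may be friendlier on Windows:

        # cut the archive into pieces small enough for one DVD each
        split -b 4300M library-archive.zip library-archive.zip.part-
        # burn the .part-* files to separate discs; later, stitch them back together
        cat library-archive.zip.part-* > library-archive.zip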

  • conditional mod_deflate based on headers

    - by Ben K.
    mod_deflate seems pretty sweet. I'd love to turn it on across the board for text/html -- but for certain pages, I don't want to gzip, since upstream proxies need to be able to inspect the content. I know there's an AddOutputFilterByType directive -- is there any way to combine that with a header inspection, so that if I see X-NO-COMPRESS: true I skip mod_deflate?
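
    Assuming the X-NO-COMPRESS marker arrives as a request header, a minimal sketch using SetEnvIf and the no-gzip environment variable that mod_deflate already honours; if the flag only exists on the response, this will not work as written:

        AddOutputFilterByType DEFLATE text/html
        # requests carrying "X-NO-COMPRESS: true" set no-gzip, so mod_deflate skips them
        SetEnvIf X-NO-COMPRESS ^true$ no-gzip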

  • Beginner server local installation

    - by joanjgm
    Here's the thing: I own a small business, and currently my emails are managed by some regular hosting using cPanel. I bought a small server and installed Windows Server and Exchange. Can you tell me what I did wrong here? I installed and configured my current existing domain, configured all the email addresses, installed No-IP in case my public address changes, and in the cPanel of the domain I added an MX record pointing to the No-IP hostname of the server with priority 0, so now emails are received by my own server. But now, whenever I send an email to anyone (Gmail, Hotmail, etc.), I get a response saying it cannot be delivered since it may be junk. This didn't happen when I sent emails from the hosting. What's missing? What did I do wrong? Here's the bounce:

        mx.google.com rejected your message to the following e-mail addresses:
        Joan J. Guerra Makaren ([email protected])
        mx.google.com gave this error:
        [186.88.202.13 12] Our system has detected that this message is likely unsolicited mail. To reduce the amount of spam sent to Gmail, this message has been blocked. Please visit http://support.google.com/mail/bin/answer.py?hl=en&answer=188131 for more information. cn9si815432vcb.71 - gsmtp
        Your message wasn't delivered due to a permission or security issue. It may have been rejected by a moderator, the address may only accept e-mail from certain senders, or another restriction may be preventing delivery.

        Diagnostic information for administrators:
        Generating server: SERVERMEGA.megaconstrucciones.com.ve
        [email protected]
        mx.google.com #550-5.7.1 [186.88.202.13 12] Our system has detected that this message is 550-5.7.1 likely unsolicited mail. To reduce the amount of spam sent to Gmail, 550-5.7.1 this message has been blocked. Please visit 550-5.7.1 http://support.google.com/mail/bin/answer.py?hl=en&answer=188131 for 550 5.7.1 more information. cn9si815432vcb.71 - gsmtp

        Original message headers:
        Received: from SERVERMEGA.megaconstrucciones.com.ve ([fe80::9096:e9c2:405b:6112]) by SERVERMEGA.megaconstrucciones.com.ve ([fe80::9096:e9c2:405b:6112%10]) with mapi; Thu, 29 May 2014 11:32:19 -0430
        From: prueba <[email protected]>
        To: "Joan J. Guerra Makaren" <[email protected]>
        Subject: Probando correos
        Thread-Topic: Probando correos
        Thread-Index: Ac97V1eW4OBFmoqJTRGoD7IPTC2azg==
        Date: Thu, 29 May 2014 16:04:35 +0000
        Message-ID: <[email protected]>
        Accept-Language: en-US, es-VE
        Content-Language: en-US
        X-MS-Has-Attach:
        X-MS-TNEF-Correlator:
        Content-Type: multipart/alternative; boundary="_000_000f42494487966276f7b241megaconstruccionescomve_"
        MIME-Version: 1.0
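
    A hedged first step, under two assumptions taken from the bounce above (that the sending domain is megaconstrucciones.com.ve and that the server's public IP is 186.88.202.13): publish an SPF record for the domain and make sure the IP's reverse DNS matches the server's HELO name. This may not be enough on its own, since Gmail is often strict with mail sent directly from dynamic/residential IP ranges regardless of SPF; relaying outbound mail through a smarthost is the usual fallback.

        ; DNS zone-file syntax: authorize the Exchange server's public IP to send for the domain
        megaconstrucciones.com.ve.  IN  TXT  "v=spf1 mx ip4:186.88.202.13 ~all"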

  • How can I read out internal pdf creation/modified date with Windows PowerShell?

    - by Martin
    PDF files seem to have a separate set of file properties which contain (among others) a creation date and a modified date (see screenshot here: http://ventajamarketing.com/writingblog/wp-content/uploads/2012/02/Acrobat-Document-Properties1-300x297.png). Those dates can obviously differ from the creation and modified dates shown in the file system (Windows Explorer). How can I access the date information in the PDF file and read it out on Windows 7 with Windows PowerShell (or maybe another method)?
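
    One way to get at those internal fields from PowerShell, assuming the free ExifTool utility is installed and on the PATH (the folder is a placeholder); CreateDate and ModifyDate here are the PDF document properties rather than the file-system timestamps:

        # print the internal PDF creation/modification dates for every PDF in a folder
        Get-ChildItem C:\docs\*.pdf | ForEach-Object { exiftool -CreateDate -ModifyDate $_.FullName }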

  • Apache mod_proxy dynamic filter

    - by jrhicks
    How can I configure Apache to ProxyBlock content based on something dynamic, such as time-of-day or max-use? Basically, I'm curious about the scriptability of Apache. My web-stumbling leads me to believe I can combine mod_proxy and mod_perl in interesting ways to do dynamic filtering, but I'm pretty lost. What are some general instructions, tutorials, books or technologies to begin scripting Apache (or any suitable proxy)?
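
    On the scriptability angle, one hedged sketch: mod_rewrite's RewriteMap with the prg: type hands each lookup to a long-running external program (written in any language), which can apply time-of-day or usage-count rules; proxy-acl.pl is a hypothetical script that reads one hostname per line on stdin and prints allow or deny:

        # in the proxy's server/vhost config (RewriteMap is not allowed in .htaccess)
        RewriteEngine On
        RewriteMap acl prg:/usr/local/bin/proxy-acl.pl
        RewriteCond ${acl:%{HTTP_HOST}|allow} =deny
        RewriteRule ^ - [F]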

  • nginx URL rewrites for PHP URLs

    - by Tronic
    I have to permanently redirect some old URLs in nginx. The old URLs are old-style PHP URLs that include a parameter for loading content. They look like this:

        http://www.foo.com/index.php?site=foo
        http://www.foo.com/index.php?site=bar

    I want to redirect them to other URLs like:

        http://www.foo.com/news
        http://www.foo.com/gallery

    Any advice on how I can achieve this? My attempts have failed. Thanks in advance!
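
    A minimal sketch of one way to do it, assuming the mapping is site=foo to /news and site=bar to /gallery as in the example. nginx locations never match the query string, so the parameter has to be tested via $arg_site; the trailing "?" in the target drops the old query string from the redirect:

        location = /index.php {
            if ($arg_site = foo) { rewrite ^ /news? permanent; }
            if ($arg_site = bar) { rewrite ^ /gallery? permanent; }
        }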

  • Blocking specific IP requests

    - by user42908
    Hi, I own a VPS running Ubuntu with Apache. Recently I have been getting continuous requests from the IP static-195.22.94.120.addr.tdcsong.se.54303 : 12337. I have already installed 'arno-iptables-firewall' and have iptables blocking 195.22.94.120, but I still see requests from that IP when I watch with tcpdump. May I know what else I can do to protect my VPS? Thank you.
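
    Two notes. First, tcpdump captures packets before netfilter's INPUT chain processes them, so still seeing the traffic in tcpdump does not by itself mean the block is failing. Second, a minimal sketch of a drop rule inserted ahead of whatever arno-iptables-firewall sets up, plus a way to confirm it is being hit:

        # -I INPUT 1 puts the rule at the top of the chain so it is matched first
        iptables -I INPUT 1 -s 195.22.94.120 -j DROP
        # watch the packet counters to confirm hits against the rule
        iptables -L INPUT -v -n --line-numbers | head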

  • How to use rsync when filenames contain double quotes?

    - by wfoolhill
    I am trying to synchronize the content of the directory my_dir/ from /home to /backup. This directory contains a file whose name has a double quote in it, such as to"to. Here is my rsync command:

        rsync -Cazh /home/my_dir/ /backup/my_dir/

    And I get the following message:

        rsync: mkstemp "/backup/my_dir/.to"to.d93PZr" failed: Invalid argument (22)

    For info, rsync works well when the synchronized filenames contain single quotes, parentheses and spaces. So why is it choking on a double quote? Thanks for any help.
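
    One quick check, on the assumption that the failure is about the destination rather than rsync's own quoting (a common culprit is /backup sitting on a filesystem such as FAT/NTFS/CIFS that forbids double quotes in file names):

        # can the destination filesystem store a double quote in a name at all?
        touch '/backup/my_dir/to"to' && echo "destination accepts double quotes"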

  • Localhost service accessible from remote address

    - by dynback.com
    On my home Windows box I have a Cassini server listening on localhost:10000, and I want it to be accessible from the internet via my static IP. I tried netcat, "nc -l -p 10001 localhost 10000", but it results in "invalid connection to [IP] from [IP] 16074". Before that it was working properly through Opera Unite, but now it only writes the message: "An error occured. See error log for details". I don't know where to find that log.
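
    Cassini normally accepts connections only on the loopback interface, so a plain external listener will not help. A sketch of one Windows-native workaround (assumptions: an elevated prompt, and port 10001 opened in the firewall/router) is a netsh port proxy that relays outside connections to the local port:

        netsh interface portproxy add v4tov4 listenaddress=0.0.0.0 listenport=10001 connectaddress=127.0.0.1 connectport=10000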

  • Nginx: bug using if in location, how do I rectify

    - by Quintin Par
    I am using nginx in reverse proxy mode. In my server section I have this code to set the expiry and cache control of my static files:

        location ~* ^.+\.(css|js|png|gif)$ {
            access_log off;
            expires max;
            add_header Cache-Control public;
            if (!-f $request_filename) {
                proxy_pass http://localhost:82;
            }
        }

    This is quite obviously creating issues. Can someone help me correct this code to use try_files or rewrite?
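
    A sketch of the same behaviour without the if block (an assumption: the backend on port 82 should only be consulted when the file is missing on disk); try_files with a named location replaces the if/proxy_pass combination:

        location ~* \.(css|js|png|gif)$ {
            access_log off;
            expires max;
            add_header Cache-Control public;
            # serve the file from disk when it exists, otherwise hand the request to the backend
            try_files $uri @backend;
        }

        location @backend {
            proxy_pass http://localhost:82;
        }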

  • Mails are coming from mail server to my account automatically...???

    - by Jayakrishnan T
    Hi all, I am getting the same mail from the admin account of my mail server every day, automatically. The content of the mail is given below:

        Click here to access your spam quarantine. The spam quarantine contains emails that are being held from your email account. Quarantined emails can be released to your inbox or deleted using the spam quarantine link.

    Please give me advice on how to solve this problem.

  • Recover an improperly burned DVD from a camcorder

    - by tomo
    Can anybody suggest good, and preferably free, software working on Vista / 7 for recovering content from DVD discs? A few DVD-R VOB files cannot be read from the disc by Windows; the camera probably failed to burn it correctly. What I want to achieve is to skip the few invalid frames in the VOB files and recreate a proper MPEG stream, without re-encoding the whole stream and losing quality.
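
    One thing worth trying before dedicated recovery tools, assuming a Windows build of ffmpeg is available (file names are placeholders): a straight stream copy re-muxes the VOB into a fresh MPEG container without re-encoding and will often step over damaged sections with warnings. If the read errors happen at the disc level, imaging the disc first (for example with ddrescue) is the safer route:

        ffmpeg -i VTS_01_1.VOB -c copy repaired.mpg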

  • Configure server on network to analyze traffic

    - by Strajan Sebastian
    I have the following network: http://i.stack.imgur.com/rapkH.jpg

    I want to send all the traffic from the devices that connect to the 192.168.0.1 router to the 192.168.10.1 router (and eventually to the Internet), passing through the server and an additional router. Almost 2 days have passed and I can't figure out what is wrong. While searching the Internet for a similar configuration I found some articles that are somehow related to my needs, but the proposed solutions don't seem to work for me. This is a similar article: iptables forwarding between two interface.

    I did the following for the configuration process:

    1. Set static IP address 192.168.1.90 for eth0 on the server, from the 192.168.1.1 router.
    2. Set static IP address 192.168.0.90 for eth1 on the server, from the 192.168.0.1 router.
    3. Forwarded all the traffic from the 192.168.0.1 router to the server on the eth1 interface, which seems to be working. The router firmware has an option to redirect all the traffic from all the ports to a specified address.
    4. Added the following rules on the server (only the following, there aren't any additional rules):

        iptables -t nat -A POSTROUTING -o eth1 -j MASQUERADE
        iptables -A FORWARD -i eth1 -o eth0 -m state --state RELATED,ESTABLISHED -j ACCEPT
        iptables -A FORWARD -i eth0 -o eth1 -j ACCEPT

    I also tried changing the second rule into "iptables -A FORWARD -i eth1 -o eth0 -j ACCEPT", but it is still not working.

    5. Added the following to enable packet forwarding on the server, which is running CentOS:

        echo 1 > /proc/sys/net/ipv4/ip_forward
        sysctl -w net.ipv4.ip_forward=1

    After a server restart and an extra check to see that all the configuration above is still in place, I tried again to ping the 192.168.1.1 router from a computer connected to the 192.168.0.1/24 LAN, but it didn't work. The server has tshark (console Wireshark) installed, and I found that while sending a ping from a computer connected to the 192.168.0.1 router to 192.168.1.1, 192.168.0.90 (eth1) receives the ping but doesn't forward it to the eth0 interface as the rule "iptables -A FORWARD -i eth1 -o eth0 -j ACCEPT" says, and I don't know why this is happening.

    Questions: 1. The iptables rules don't seem to work as I expect. Do I need to add rules to the NAT table to redirect the traffic to the proper location, or is something else wrong with what I've done? 2. I want to use tshark to view the traffic on the server because I think it is the best tool for this. Do you know something better than tshark to capture and maybe analyze the traffic?
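
    One detail that stands out (an observation from the description above, not a verified fix): with the LAN router on eth1 and the upstream 192.168.1.1 router on eth0, the MASQUERADE rule would normally go on the outbound interface, eth0, rather than eth1. A sketch of that arrangement, assuming the FORWARD chain's default policy isn't already dropping the packets:

        # NAT on the interface that faces the upstream router; forward both directions
        iptables -t nat -A POSTROUTING -o eth0 -j MASQUERADE
        iptables -A FORWARD -i eth1 -o eth0 -j ACCEPT
        iptables -A FORWARD -i eth0 -o eth1 -m state --state RELATED,ESTABLISHED -j ACCEPT

    For the second question, tshark is a reasonable choice; tcpdump is the lighter-weight alternative for plain capture, with the .pcap analyzed later in Wireshark.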

  • gzip compression good or bad?

    - by WarDoGG
    I have a server that currently does a lot of processing in my application, and the target users are those who have a very good internet connection. The output sent from the server is always text/html, and we do not use any media (audio/video), only images (static site images like the logo, etc.). We are experiencing severe performance issues, and I am wondering about turning off gzip/mod_deflate on the server so that it avoids compressing the output. Will this cause an improvement in performance?
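
    Before turning it off entirely, one middle ground worth measuring (a sketch, assuming Apache with mod_deflate): lower the compression level, which cuts most of the CPU cost while keeping most of the bandwidth saving. Whether disabling helps at all is an empirical question, so measure both with and without:

        # 1 = fastest/least compression, 9 = slowest/most
        DeflateCompressionLevel 1
        AddOutputFilterByType DEFLATE text/html text/css application/javascript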

  • Force www. on multi domain site and retain http or https [closed]

    - by John Isaacks
    I am using CakePHP, which already contains an .htaccess file that looks like this:

        <IfModule mod_rewrite.c>
            RewriteEngine on
            RewriteRule ^$ app/webroot/ [L]
            RewriteRule (.*) app/webroot/$1 [L]
        </IfModule>

    I want to force www. (unless it is a subdomain) to avoid duplicate content penalties, and it needs to retain http or https. Also, this application will have multiple domains pointing to it, so the code needs to work with any domain.
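
    A sketch of one way to do it with mod_rewrite, placed inside the same IfModule block before the CakePHP rules. Two assumptions to flag: the scheme is remembered in an environment variable derived from %{HTTPS}, and "not a subdomain" is approximated as "the host has exactly two labels", which will not cope with two-part TLDs such as .co.uk:

        # remember the scheme so the redirect keeps http/https
        RewriteCond %{HTTPS} =on
        RewriteRule ^ - [E=proto:https]
        RewriteCond %{HTTPS} !=on
        RewriteRule ^ - [E=proto:http]

        # 301 any bare two-label host (works for any domain) to its www. form
        RewriteCond %{HTTP_HOST} !^www\. [NC]
        RewriteCond %{HTTP_HOST} ^[^.]+\.[^.]+$
        RewriteRule ^ %{ENV:proto}://www.%{HTTP_HOST}%{REQUEST_URI} [R=301,L]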
