Search Results

Search found 28303 results on 1133 pages for 'multi site'.

Page 258/1133

  • Can too many 301 redirects cause a DNS error?

    - by Graham
    For a site I just set up, http://imageocd.com, I initially spelled the category "automobiles" as "autimobiles"... I know it's ridiculous. I then set up over 10,000 pages behind that category, e.g. http://imageocd.com/automobiles/hillman-minx-cabrio-pictures-and-wallpapers. So, I set up over 10,000 301 URL redirects to fix the spelling of "automobiles". I just checked my Google Webmaster Tools report and got an error saying: "http://www.imageocd.com/: Googlebot can't access your site" (Sep 7, 2012). "Over the last 24 hours, Googlebot encountered 2 errors while attempting to retrieve DNS information for your site. The overall error rate for DNS queries for your site is 66.7%." Could the overabundance of 301 redirects be causing this? I host 13 sites on this dedicated server and all sites are running fine. I also contacted GoDaddy and they said the server is running fine. Any ideas on what might be going on? Also, I have "canonical" set up for every URL. Could this be part of the error? Thanks.
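    (A side note on the redirects themselves: since every bad URL differs from the good one only by the misspelled first segment, they can usually be collapsed into a single pattern rule rather than 10,000 individual entries. The sketch below is a hypothetical .htaccess fragment and assumes the site runs on Apache with mod_rewrite, which the question does not actually state.)

        # Hypothetical sketch -- assumes Apache + mod_rewrite; one pattern rule
        # replaces the 10,000 individual "autimobiles" -> "automobiles" redirects.
        RewriteEngine On
        RewriteRule ^autimobiles/(.*)$ /automobiles/$1 [R=301,L]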

    Read the article

  • AdSense (reports) and custom channels

    - by RobbertT
    Please help me to further understand custom channels. Google describes them as a way to map your ads, but I still have a few questions: Is it correct that a single custom channel per ad is not very useful, since you can already specify ad blocks in the AdSense reports? I have multiple ads in multiple custom channels. After this I created one custom channel and added all the ads to it. I made this channel targetable, so people can target all the ads at once through this channel. Is this a good way to do it? In other words, is it possible to have ads in multiple custom channels (without targeting, just for analysis) and then create one custom channel with targeting that embraces all the desired ads? Also, why is it not possible for me to analyze custom channels (or ad blocks and formats) per site in the AdSense reports? Or am I doing something wrong? If not, do I have to create a different custom channel per site to see how certain ads are doing at the site level?

    Read the article

  • Host forwarding fails, server is up, domain name tests ambiguous

    - by jayunit100
    I have a domain name registered with http://www.registryrocket.com/. The "main" site, rudolfcode.net, is registered under GoDaddy and forwards to a Heroku site (rudolfcode.herokuapp.com). I have found that the main site, rudolfcode.net, works, but the hostgator forwarding has stopped working (Firefox simply fails when you point it to http://www.rudolflabs.com, which is the domain name registered by hostgator). How can I debug this issue? Finally, I have tried to run some DNS tests, and here are the results: I'm not sure what the failures mean, but I'm pretty sure that "Connecting to WWW Home Page" failing is a pretty bad sign! Thanks.

    Read the article

  • SharePoint 2010 MySites - Simple explanation needed!

    - by Chris W
    I've been playing around with the 2010 beta for a couple of weeks, experimenting with topology options etc. I think I've got myself totally confused as to how it works, so if there are any SharePoint experts out there who can explain things in simple terms, I'd appreciate it! I want to set up a farm with 3 servers providing the content and MySites. I presume that the way to do this is to load balance or DNS round robin traffic between the 3 servers. The bit where I'm confused is that the My Site Settings page asks for a specific My Site Host, so all My Site traffic will be pushed to a single server even though we have 3 in the farm. If this host fails, I presume MySites will be unavailable. Is this right? How do I configure it so that access to MySites is load balanced across the 3 servers in the farm?

    Read the article

  • Ask the Readers: How Many Monitors Do You Use with Your Computer?

    - by Asian Angel
    Most people have a single monitor for their computers, many have two, and some individuals enjoy “3 monitor plus” goodness. This week we would like to know how many monitors you use with your computer. Photo by DamnedNice. A good majority of people have a single monitor that they use with their computers and that single monitor serves their needs very well. It could be that these individuals do not engage in a heavy amount of work or play on their computers…they just need to do the basics like checking e-mail, using I.M., working with photos, etc. Another possibility is the use of virtual desktop software such as Dexpot, Yodm 3D, or Sysinternals Desktops on Windows systems. Linux systems such as Ubuntu already have that wonderful multi-desktop functionality built in. The wonderful part about virtual desktops is that a single monitor can feel equivalent to a small army of monitors. The ability to separate your open windows into “categories” and spread them out across multiple desktops is definitely nice. With each passing year dual monitor setups are becoming more common. Having twice the screen real-estate visible at the same time can be extremely convenient when you are multi-tasking. Perhaps you like to monitor your system’s stats and an e-mail account on the second monitor while working with software on the first. It certainly beats having windows popping up and down on your screen constantly while keeping on top of everything! Next we have the people who have three or more monitors in use with their computers. This may be a result of the type of work they do, an experiment to see if multiple monitors are right for them, or the cool, geeky factor that comes with having all those monitors. Needless to say these individuals can induce a good amount of envy and/or inspiration in the rest of us when we see their awesome setups. Are you perfectly content with a single monitor? Do you have two or more monitors that you use? If you have two or more monitors are they actually that useful to you? Perhaps you are getting ready even now to add additional monitors to your system. Whatever your situation may be at the moment, let us know your thoughts (and possible multi-monitor plans) in the comments!

    Read the article

  • Amazon Careers website - are resumes processed in plain text format only?

    - by sapphiremirage
    The submission site has the following options: "Please upload your resume (Word Document, max size: 512 KB)" OR "Please copy and paste the text version of your file here", with a text box below the latter option. I went ahead and uploaded my shiny LaTeX resume (as a PDF), despite the fact that they seem to want a Word document, and there didn't seem to be any issues. However, when I went back to edit my profile, there was no evidence that my PDF had been uploaded, other than a text version of my resume, awfully formatted and clearly stripped from the PDF, sitting in the text box below "Please copy and paste the text version of your file here". Exasperated, I did a quick and dirty copy of the text from my resume into a Word doc and uploaded that. Same result: no evidence of a file uploaded, just a stripped text version in the text box. What I'm wondering now is, are they only going to look at the text version of my resume? If that's the case then I'm obviously going to edit it so that it looks halfway decent and doesn't contain such atrocities from the conversion as "Other Skills: LTEX". I can make a plain text file look pretty without too much effort, so this isn't that big of a deal. However, my LaTeX resume is going to look better than anything I can do in plain text, so if the site is actually keeping a copy of it, then I certainly don't want to override it. Has anyone here gone through the Amazon hiring process, or interviewed candidates, and knows how this works? (i.e. when on site with Amazon, did the interviewers have diversely formatted resumes, or did they all look suspiciously similar?)

    Read the article

  • Can I use a 302 redirect to serve up static content from a URL with _escaped_fragment_?

    - by Starfs
    We would like to serve up SEO-friendly Ajax-driven content. We are following this documentation. Has anyone ever tried to write a 302 redirect into the .htaccess file that takes the ?_escaped_fragment_= string and sends it to a static page, for example /snapshot/yourfilename/? How will Google react to this? I've gone through the documentation and it's not very clear. The quote below is from Google's documentation; it's what I found. I'm not sure if they are saying that you can redirect the _escaped_fragment_ URL to a different static page, or if this is about redirecting the hashbang URL to static content. Thoughts? From Google's site: Question: Can I use redirects to point the crawler at my static content? Redirects are okay to use, as long as they eventually get you to a page that's equivalent to what the user would see on the #! version of the page. This may be more convenient for some webmasters than serving up the content directly. If you choose this approach, please keep the following in mind: Compared to serving the content directly, using redirects will result in extra traffic because the crawler has to follow redirects to get the content. This will result in a somewhat higher number of fetches/second in crawl activity. Note that if you use a permanent (301) redirect, the url shown in our search results will typically be the target of the redirect, whereas if a temporary (302) redirect is used, we'll typically show the #! url in search results. Depending on how your site is set up, showing #! may produce a better user experience, because the user will be taken straight into the AJAX experience from the Google search results page. Clicking on a static page will take them to the static content, and they may experience avoidable extra page load time if the site later wants to switch them to the AJAX experience.
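    (For what it's worth, the rewrite itself is short; the sketch below is a hypothetical .htaccess fragment that assumes Apache with mod_rewrite and that pre-rendered snapshots live under /snapshot/ named after the requested path. How the fragment value maps to a snapshot file is site-specific, so treat the rule as illustrative only.)

        # Hypothetical sketch -- Apache mod_rewrite; assumes snapshots are stored
        # under /snapshot/<path>/ . The trailing "?" drops the original query string.
        RewriteEngine On
        RewriteCond %{QUERY_STRING} (^|&)_escaped_fragment_=
        RewriteRule ^(.*)$ /snapshot/$1/? [R=302,L]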

    Read the article

  • SharePoint Returning a 401.1 for a Specific User/Computer

    - by Joe Gennari
    We have a SharePoint Services 3.0 site set up supporting about 300 users right now. This report is isolated and has never been duplicated. We have one AD user who cannot log into the SharePoint site with his account from his machine and is subsequently returned a 401.1 error. If any other user tries to log on with their account from his machine, it works okay. If he moves to another machine and logs on, it works okay. The only solution to this point has been to install Firefox on the machine. When he authenticates with Firefox, everything is okay. Remedies tried so far: cleared cookies/cache; turned Integrated Windows Authentication in IE off and on; downgraded IE 8 to IE 6; removed the site from the Intranet Sites zone; renamed the machine; disjoined/rejoined the domain.

    Read the article

  • design question for transportation agency/workflow system

    - by George2
    I am designing a transportation agency/workflow system, and it involves 3 types of people: customers who request that goods be transported, drivers who deliver the goods, and truck managers who coordinate source/destination trucks and communicate with and organize drivers. The system is expected to be a web site, and all 3 kinds of people could use it to submit requests, accept requests, monitor the status of a specific shipment, etc. The web site is more like an open agency or a workflow system. I am wondering whether there are any existing technologies, tools or projects (preferably open source, but not a must) that I could build my application on faster. I prefer to use .NET technologies, but that is not a must either. Thanks in advance!

    Read the article

  • nginx caching per user agent

    - by Tuinslak
    I'm currently using nginx as reverse proxy with caching enabled. However, the main site has two different layouts, depending on the user-agent (mobile or not). I've tried something similar to this:

        # mobile users
        if ($http_user_agent ~* '(iPhone|iPod|mobile|Android|2.0\ MMP|240x320|AvantGo|BlackBerry|Blazer|Cellphone|Danger|DoCoMo|Elaine/3.0|EudoraWeb|hiptop|IEMobile)') {
            set $iphone_request '1';
        }
        if ($iphone_request = '1') {
            proxy_cache mobile;
        }
        if ($iphone_request = '') {
            proxy_cache site;
        }
        proxy_cache_key "$scheme://$host$request_uri";
        proxy_pass http://real-site.tld;

    However, nginx gives an error, stating proxy_cache can't be used in an if-structure. Any other way to serve from a different cache depending on the browser? Thanks, Tuinslak
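    (One workaround, sketched below purely as an assumption and not a tested config: classify the browser with a map block in the http context and fold the resulting variable into proxy_cache_key, so mobile and desktop variants are stored under different keys in a single zone without any if blocks. The $device_class variable name and the trimmed user-agent pattern are illustrative.)

        # http context -- hypothetical sketch: derive a cache-key prefix from the user agent
        map $http_user_agent $device_class {
            default                                            desktop;
            ~*(iPhone|iPod|mobile|Android|BlackBerry|IEMobile) mobile;
        }

        # inside the existing server/location block
        proxy_cache      site;
        proxy_cache_key  "$device_class:$scheme://$host$request_uri";
        proxy_pass       http://real-site.tld;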

    Read the article

  • IIS and PHP restrict IO permissions

    - by ULTRA_POROV
    I have PHP installed through a FastCGI module. Is there a way to restrict the module's (php.exe) read/write permissions to only the directory (plus subdirectories) of the IIS site that is calling it? I need this to prevent one IIS PHP site from having access to files outside its own directory. How do I do this? Is there a setting in php.ini or in the IIS configuration? I believe such a feature could exist, because when a file on the server is requested, the root path of the site is also known; all it would take is for IIS to pass this path to the PHP module, which would then allow only IO operations within that path. PS: I know it is possible to achieve this by using a different Windows account for each website, but this is not an option.

    Read the article

  • Multiple SSL on same IP [closed]

    - by kadourah
    Possible Duplicate: Multiple SSL domains on the same IP address and same port? I have the following situation. First domain: test.domain.com; IP: 1.2.3.4; port: 443; SSL: purchased from GoDaddy and specific to that domain. Works fine, no issues. I would like to add another site: test2.domain.com; IP: the same; port: can be different; SSL: different, since I can't use the certificate above because it's specific to the first site. Now, when I add the HTTPS binding to the second site with an IP:port combination, it appears to always load the first certificate, ignoring the second one. How can I add a second SSL binding to the same IP using a different certificate? Can this be done?

    Read the article

  • Recommend a company to take care of webserver and WordPress management?

    - by javipas
    I'm interested in setting up a professional WordPress site, but I'd like to explore the possibility of leaving the management of the webserver, and even of WordPress itself, to a company that guarantees great availability and site performance (load times, security) and even SEO. My site is currently running on another platform, but I plan to migrate in the next 4 weeks. I've usually done this myself, but I'd like to focus on the content, so I don't have to mess with webserver/MySQL/PHP configs in order to get decent performance. Is there some (maybe hosting) company that is dedicated to this? Would it be better to hire a sysadmin with experience in those matters?

    Read the article

  • Pages partially load on rapid refresh

    - by user101570
    I recently set up a VPS slice with 256MB to run a LAMP stack (Ubuntu 11.04, Apache2, Mysql, PHP5). So far I'm only running a simple Wordpress site on an IP-based virtual host I set up. The performance is excellent, but I've noticed that if I send multiple HTTP requests from the same IP in a short time period, only partial pages are rendered. Then if I wait a bit and refresh the page, the entire page loads again. I noticed this behaviour when accessing the site from two browsers from my office desktop, but it also presents itself if I quickly navigate the site from a single browser (any browser). I'm guessing this is an Apache phenomenon, as the pages are rendered correctly except under the conditions above, but perhaps I'm wrong here. Could it be my hosting company with some kind of DOS protection in place? As a relative Linux/server noob, I'd really appreciate any insight into what settings in Apache could explain this behaviour, and how I might go about changing it.

    Read the article

  • Hosting and domain registrations for multiple clients

    - by letseatfood
    I am finally getting regular work designing, developing, and deploying websites for small businesses and individuals. So far the websites use single-user content management systems, so they create, as far as I know, minimal load on the shared servers. I have always required that each of my clients purchase annual shared hosting at DreamHost. For domain registration, I ask that they register with DreamHost, but some already have a domain registered elsewhere and this is fine with me. I do this so the billing issues are the client's responsibility, not mine. My question is: since I can register unlimited domains and connect them to my one shared hosting account at DreamHost, should I not be requiring clients to individually pay for shared hosting and a domain? Should I actually be paying for one hosting account and then hosting all of my clients' websites on that account? As I said before, I currently have each client buy their own hosting, because I feel that, for example, if there is high traffic to their site, there would be less of a chance of the site going down than if their site were hosted with many others on one account. I am famous for being long-winded, so please let me know if I can clarify at all. Thanks!

    Read the article

  • Hide directory contents from showing when accessing the URL directly

    - by SoLoGHoST
    On my site, if you browse to http://example.com/images/ the contents of the entire directory are listed. How can I make it so that this doesn't show up when people browse directly to http://example.com/images/? Can I create an .htaccess file in that directory? Or is there a better way? I really don't want people being able to do this for any directory on the site. What can I do to prevent it? I figure it's either something that has to be done in Apache, or done with a global .htaccess file placed in the public_html folder, perhaps? EDIT: I worked around this by adding an index.php file, but I still feel that security is an issue here; how can I fix this permanently?
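    (For reference, the usual Apache answer is a one-line directive; the sketch below assumes the host allows Options to be overridden in .htaccess.)

        # .htaccess in public_html (or in images/) -- assumes AllowOverride permits Options
        # Turns off automatic directory listings for this directory and its subdirectories.
        Options -Indexes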

    Read the article

  • Summary of usage policies for website integration of various social media networks?

    - by Dallas
    To cut to the chase... I look at Twitter's usage policy and see limitations on what can and can't be done with their logo. I also see examples of websites that use icons that have been integrated with the look and feel of their own site. Given Twitter's policy, for example, it would appear that legal conversations/agreements would need to take place to do this, especially on a commercial site. I believe it is perfectly acceptable to have a plain text button that simply has the word "Tweet" on it, that has the same functionality. My question is if anyone can provide online (or other) references that attempt to summarize what can and can't be done when integrating various social networks into your own work? The answer I will mark as the correct one will be the one which provides the best resource(s) giving the best summaries of what can and can't be done with specific logos/icons, with a secondary factor being that a variety of social networking sites are addressed in your answer. Before people point to specific questions, I am looking for a well-rounded approach that considers a breadth of networks and considerations. Background: I would like to incorporate social media icons and functionality, but would like to consider what type of modifications can be done without needing to involve lawyers. For example, can I bring in a standard Facebook logo, but incorporate my site color into the logo? Would the answer differ if I maintained their color, but add in a few pixels of another color to transition? I am not saying I want to do this, but rather using it as an example.

    Read the article

  • Run single php code using multiple domains

    - by Acharya
    Hi all, I have a PHP code/site at xyz.com. Now I want to run the same site using multiple domains, meaning that when somebody opens domain1.com, domain2.com, domain4.com, and so on, it should run the code/site that is at xyz.com. I know one way to do this: I can host all these domains on the server where xyz.com is hosted, so all the domains point to the same piece of code/site. In the above solution I need to add the domains manually. Is there any other way to do this, as I want to add domains dynamically? Thanks in advance!
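    (One common setup, sketched below as an assumption since the question doesn't say which web server is in use: a single Apache virtual host with aliases serving the xyz.com document root. If it is the only, or the first, virtual host on the IP, Apache treats it as the default, so any new domain whose DNS points at the server is picked up without further configuration. Paths and names are examples.)

        # Hypothetical Apache vhost -- every listed (or defaulted) domain serves the same code
        <VirtualHost *:80>
            ServerName  xyz.com
            ServerAlias domain1.com domain2.com domain4.com
            DocumentRoot /var/www/xyz.com
        </VirtualHost>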

    Read the article

  • Combine several locations with regex in nginx

    - by AlexAtNet
    I have a dynamic number of Joomla installations in subfolders of the domain. For example: http://site/joomla_1/ http://site/joomla_2/ http://site/joomla_3/ ... Currently I have the following config that works:

        index index.php;
        location / {
            index index.php index.html index.htm;
        }
        location /joomla_1/ {
            try_files $uri $uri/ /joomla_1/index.php?q=$uri&$args;
        }
        location /joomla_2/ {
            try_files $uri $uri/ /joomla_2/index.php?q=$uri&$args;
        }
        location ~ \.php$ {
            fastcgi_pass unix:/var/run/php5-fpm/joomla.sock;
            ...
        }

    I'm trying to combine the joomla_N rules into one:

        location ~ ^/(joomla_[^/]+)/ {
            try_files $uri $uri/ /$1/index.php?q=$uri&$args;
        }

    but the server starts returning index.php as-is (it does not call php-fpm). It looks like nginx stops processing the regex rules after the first match. Is there any way to combine these rules into one regex location?
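    (The observed behaviour matches how nginx evaluates locations: regex locations are checked in the order they appear and the first match wins, so the new combined rule swallows the *.php requests before the fastcgi location is ever considered. A minimal sketch of one fix, untested and keeping the existing fastcgi settings, is simply to declare the PHP handler before the combined rule:)

        # Sketch: order matters for regex locations -- first match wins
        location ~ \.php$ {
            fastcgi_pass unix:/var/run/php5-fpm/joomla.sock;
            # ...existing fastcgi settings...
        }

        location ~ ^/(joomla_[^/]+)/ {
            try_files $uri $uri/ /$1/index.php?q=$uri&$args;
        }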

    Read the article

  • Which programming language should I choose if I want to build this website? [closed]

    - by Goma
    Assume that I will start with just a photo sharing website. Every user can add comments to any photo. After that, the site will contain news (general news); the admin and the moderators can add news, while users can add comments on the news. The website will also provide a photo uploader, so every user will have up to 20 MB to upload any photos they want. Other users can or cannot see these photos, depending on the option the main user chose (whether or not he wants to publish his photos). The site should also have a small forum, which lets the admin add categories and lets users add topics and replies to each topic in these categories. These are the things that I can think of now, but the website will add other features and services later on. Can you tell me which programming language can help me do all that? I need a programming language that provides the following: 1) fast page loads for the site; 2) the ability to add more functions quickly and to edit code easily for any reason; 3) security; 4) fast display of information from the database.

    Read the article

  • IIS 7 SSO stops working during high CPU load? [migrated]

    - by DanB
    On our IIS7 site (Windows 2008 Server), we have set up single sign-on (SSO). It seems to work fine most of the time, but when the CPU load becomes high, SSO authentication completely stops working. I did some research and tried this suggestion to increase the max number of worker processes in the default app pool, but the increase did not help. Some details: The site is a WordPress blog. The server has plenty of RAM (2 GB) and free disk space. SSO is achieved by putting a copy of the WordPress login page (wp-login.php) into a subfolder below the root that has anonymous authentication disabled, and then redirecting the browser to it. This was the recommendation of Microsoft given to our consultants. To increase CPU load for testing, I have three scripts hit the home page simultaneously, over and over. This drives CPU to 100%. When these scripts are running, SSO authentication simply doesn't happen. As soon as I stop the scripts, SSO works again. (I should mention that the SSO problem also happens when many users visit the site at once....) The WordPress database process (mysqld) is not stressed at all by the scripts. I would be happy to provide further diagnostics. Any help appreciated!

    Read the article

  • Mitigating the 'firesheep' attack at the network layer?

    - by pobk
    What are sysadmins' thoughts on mitigating the 'Firesheep' attack for servers they manage? Firesheep is a new Firefox extension that allows anyone who installs it to sidejack any session it can discover. It does its discovery by sniffing packets on the network and looking for session cookies from known sites. It is relatively easy to write plugins for the extension to listen for cookies from additional sites. From a systems/network perspective, we've discussed the possibility of encrypting the whole site, but this introduces additional load on the servers and interferes with site indexing, assets and general performance. One option we've investigated is to use our firewalls to do SSL offload, but as I mentioned earlier, this would require all of the site to be encrypted. What are the general thoughts on protecting against this attack vector? I've asked a similar question on Stack Overflow; however, it would be interesting to see what the systems engineers think.

    Read the article

  • Location-Based redirection and duplication in sub-directories affecting SEO

    - by Joshua
    I currently own the website www.xyz.com. The website has a sub-directory for each of the 3 target countries: .../en-US/ (United States), .../es-MX/ (Mexico), and .../es-DO/ (Dominican Republic). I have two main questions about this setup: Currently, the main domain/root (xyz.com) contains a blank index.php file, but I would like a user to be redirected to one of the sub-directories based on their regional location. What is the best way to accomplish this? I have looked at using browser language-based redirection, but how would I know whether to direct a user to the MX or the DO site if the browser language is set to Spanish? Is there a way to detect a user's geographic location? Also, the 3 websites are practically identical, except that they have 3 unique color schemes and the US site is in English while the MX and DO sites are in Spanish. My problem is that I believe Googlebot is penalizing/banning my site because the Spanish text on the MX and DO pages is nearly identical and is thus marked as duplicate content/spam. Is there a way to avoid this?

    Read the article

  • Restart single uWSGI application (when it's in emperor mode)

    - by Oli
    I'm running uWSGI in emperor mode to host a bunch of Django sites based on their individual configs. These are supposed to reload when the emperor detects a change in the config file, and this largely works when I just touch the relevant uwsgi.ini file. But occasionally I'll mess something up in the Django site and the server won't load. Yeah, yeah, I should be testing better, but that's not really the point. When this happens, uWSGI seems to mark the site as dead and stops trying to run it (which seems to make sense). Even after I fix the underlying issue, no amount of touching will get that site's uWSGI process up and running. I have to reload the whole uWSGI server (knocking dozens of sites out at once for a few seconds). Is there a way to force uWSGI to just reload one of its sites?

    Read the article

  • Why do Google search results include pages disallowed in robots.txt?

    - by Ilmari Karonen
    I have some pages on my site that I want to keep search engines away from, so I disallowed them in my robots.txt file like this:

        User-Agent: *
        Disallow: /email

    Yet I recently noticed that Google still sometimes returns links to those pages in their search results. Why does this happen, and how can I stop it? Background: Several years ago, I made a simple web site for a club a relative of mine was involved in. They wanted to have e-mail links on their pages, so, to try and keep those e-mail addresses from ending up on too many spam lists, instead of using direct mailto: links I made those links point to a simple redirector / address-harvester trap script running on my own site. This script would return either a 301 redirect to the actual mailto: URL, or, if it detected a suspicious access pattern, a page containing lots of random fake e-mail addresses and links to more such pages. To keep legitimate search bots away from the trap, I set up the robots.txt rule shown above, disallowing the entire space of both legit redirector links and trap pages. Just recently, however, one of the people in the club searched Google for their own name and was quite surprised when one of the results on the first page was a link to the redirector script, with a title consisting of their e-mail address followed by my name. Of course, they immediately e-mailed me and wanted to know how to get their address out of Google's index. I was quite surprised too, since I had no idea that Google would index such URLs at all, seemingly in violation of my robots.txt rule. I did manage to submit a removal request to Google, and it seems to have worked, but I'd like to know why and how Google is circumventing my robots.txt like that, and how to make sure that none of the disallowed pages show up in their search results. P.S. I actually found out a possible explanation and solution, which I'll post below, while preparing this question, but I thought I'd ask it anyway in case someone else has the same problem. Please do feel free to post your own answers. I'd also be interested in knowing whether other search engines do this too, and whether the same solutions work for them.
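    (Context for the behaviour, with a hedged sketch of one common fix: robots.txt only stops crawling, not indexing, so a URL that other pages link to can still appear as a bare, title-only result. A noindex signal, for example an X-Robots-Tag response header, is what keeps a URL out of the index, but the crawler has to be allowed to fetch the URL to see that signal. The fragment below is a hypothetical Apache .htaccess sketch, assuming mod_headers and that it is placed so it covers the redirector script.)

        # Hypothetical sketch -- assumes Apache + mod_headers, placed to cover /email
        <IfModule mod_headers.c>
            Header set X-Robots-Tag "noindex, nofollow"
        </IfModule>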

    Read the article
