Search Results

Search found 18126 results on 726 pages for 'core pro'.

Page 147/726 | < Previous Page | 143 144 145 146 147 148 149 150 151 152 153 154  | Next Page >

  • Rackspace Cloud Servers in Europe?

    - by mit
    We have set up a cloud virtual server at Rackspace in the US, but we use it from Europe, and I am not quite happy with the response time. Of course I knew there would be some latency, but I am not sure whether it is the overseas latency (ping is 120 ms) or also the minimal resources. It is the smallest machine, 256 MB RAM and 10 GB disk, running MediaWiki on Ubuntu 10.04 64-bit, and the instance lives in the Rackspace ORD1 datacenter. As soon as they open their new facility in the UK we plan to move the instance there, and we are already planning more machines; the pricing is quite attractive. I don't really want to do a lot of measuring and benchmarking, so I am just asking for your opinions, and it would be nice to hear what you can tell from your experience - perhaps from someone who uses such small instances in the US. And what can we really expect if we upgrade to more resources?

    Read the article

  • How do you find all the links to disavow for a Google reconsideration request? [duplicate]

    - by QF_Developer
    This question already has an answer here: How to identify spammy domains giving backlinks to my site (to submit in disavow links in WMT) (2 answers)

    A few months ago I received the following notification in Google Webmaster Tools for a website I look after:

    "Unnatural links to your site—impacts links. Google has detected a pattern of unnatural, artificial, deceptive, or manipulative links pointing to pages on this site. Some links may be outside of the webmaster's control, so for this incident we are taking targeted action on the unnatural links instead of on the site's ranking as a whole. Learn more."

    The question here is: should we actively attempt to disavow these links, given that the action seems targeted at just a bunch of keywords? I've downloaded the inbound links sample from Google Webmaster Tools and so far I've been through the disavow and reconsideration request process 6 times, each taking 2-3 weeks, only to be supplied with just 2 more links that Google doesn't approve of. At this rate it will take me the rest of my natural life to clean up all these spammy links! It seems disavowing is futile, as they haven't implemented broad actions against the website as a whole and (from what I can gather) have already nullified the value of those offending links. Under the quoted statement above, however, is a reconsideration request button that seems to imply I should be actively doing something here?

    UPDATE 14th October -- I have since created a small .NET application that you can feed the CSV sample links file from Google Webmaster Tools into. The tool crawls all the links and looks for specific linking patterns, as per some configurable match strings. I realised that many of the links Google is taking issue with were created by a rogue SEO firm we hired several years ago; all of their links are appended with 1 of 5 different descriptions. The application uses some regexes to isolate any link sources with these matching appendages and automatically builds the disavow txt file. In the end it had to come down to an algorithm, as manually disavowing links on this scale would take weeks! I will post the app here once I've cleaned it up.
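    For illustration, here is a minimal sketch of the same idea in PHP rather than .NET (the file name, the match strings and the CSV layout are assumptions, not the poster's actual tool): read the links sample exported from Webmaster Tools, match each URL against the known appendages left by the SEO firm, and emit the matching hosts as domain: lines for the disavow file.

        <?php
        // Hypothetical patterns standing in for the "5 different descriptions" mentioned above
        $patterns = array('/cheap-widgets/i', '/best-widget-deals/i');

        $disavow = array();
        $fh = fopen('links_sample.csv', 'r');            // CSV export (assumed layout: URL in column 0)
        while (($row = fgetcsv($fh)) !== false) {
            $url = $row[0];
            foreach ($patterns as $re) {
                if (preg_match($re, $url)) {
                    $host = parse_url($url, PHP_URL_HOST);
                    if ($host) {
                        $disavow['domain:' . $host] = true;   // de-duplicate by host
                    }
                    break;
                }
            }
        }
        fclose($fh);
        file_put_contents('disavow.txt', implode("\n", array_keys($disavow)) . "\n");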

    Read the article

  • dynamic urls and links on one web page

    - by John
    I am trying to figure out how to create dynamic links and URLs on a static webpage. What I want to do is the following: I have a single webpage, for example MYWEBPAGEdotCOM/INDEX.HTML, that will always look the same except for one link on the page. That link would be an affiliate link, for example:

    LINK TO AFFILIATE: affiliatedotCOM/my-affiliate_code_here_DYNAMIC_REFERER

    The only thing that would change is the "DYNAMIC_REFERER", driven by the dynamic URL used to reach the page:

    MYWEBPAGEdotCOM/INDEX.PHP_id=test1
    MYWEBPAGEdotCOM/INDEX.PHP_id=test2
    MYWEBPAGEdotCOM/INDEX.PHP_id=test3
    MYWEBPAGEdotCOM/INDEX.PHP_id=test4

    which would change only the dynamic link on the page to:

    affiliatedotCOM/my-affiliate_code_here_test1
    affiliatedotCOM/my-affiliate_code_here_test2
    affiliatedotCOM/my-affiliate_code_here_test3
    affiliatedotCOM/my-affiliate_code_here_test4

    Can someone tell me how I could go about doing this? I just don't want to have to make hundreds of nearly identical pages; a dynamic link would prevent me from having to do so. A sketch of one possible approach follows.
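    Since the question already mentions INDEX.PHP?id=..., a minimal PHP sketch of that approach (the parameter name and URLs are taken from the question; the whitelist is an assumption added to avoid echoing arbitrary input into the link):

        <?php
        // index.php?id=test1  ->  affiliate link ending in _test1
        $allowed = array('test1', 'test2', 'test3', 'test4');   // known referer codes
        $id = isset($_GET['id']) ? $_GET['id'] : 'default';
        if (!in_array($id, $allowed, true)) {
            $id = 'default';
        }
        $affiliateUrl = 'http://affiliate.example/my-affiliate_code_here_' . $id;
        ?>
        <a href="<?php echo htmlspecialchars($affiliateUrl); ?>">LINK TO AFFILIATE</a>

    The rest of index.php stays the same for every visitor; only the query string changes which affiliate URL gets printed.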

    Read the article

  • Cheap Bulk Domain Registration

    - by Panoy
    I have 6-7 domain names that I have thought of, and I'm planning to buy them in bulk so that I can save. Or am I wrong on this? Since it's my first time with hosting/domain registration, the only registrar I knew of was GoDaddy. Questions: Will I lose out if I choose a cheap domain registrar compared to a popular one? For a newbie like me, what companies can you recommend for registering domain names in bulk at a cheap or affordable price? I notice that some prices are higher because those registrars offer support and customer service; are the cheaper ones not reliable at all? I've also heard that some domain registrars increase their prices at every renewal; is that just natural business practice for registrars? Before posting this I had been reading about NameCheap.com, and I'm considering registering with them unless you have other good choices to suggest. I'll appreciate any suggestion or advice you can give.

    Read the article

  • Formatting HTML lists using CSS

    - by pwaring
    I'm trying to recreate a list in HTML which has clauses and subclauses like this:

    1. Main clause
       (a) Sub clause
       (b) Sub clause
    2. Another main clause
       (a) Sub clause

    The problems I'm running into are: if I use the existing HTML elements (ol and li) there doesn't seem to be a list style for (a) - I can have a. b. c. or A. B. C. but not (a) (b) (c). If I don't use the existing HTML elements and start using span tags, then when a subclause runs beyond the end of the line the continuation appears underneath the clause number rather than being indented, when what I really want is the hanging-indent behaviour that lists give you. Is there any way to get around these two problems at the same time? I'd prefer to use semantic HTML and CSS for styling, but having the clauses spaced correctly is more important than doing things 'the right way'. I may need sub-subclauses at some point (i.e. (i), (ii) etc.), so I can't assume that (a) will be the maximum clause depth.
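    For what it's worth, a small CSS sketch of one way to get "(a)"-style markers while keeping the hanging indent of a real list (class names are made up; the :before counter technique works in IE8 and later):

        ol.subclauses {
            list-style: none;            /* suppress the default a. b. c. markers */
            counter-reset: subclause;
            padding-left: 2.5em;         /* room for the (a) marker */
        }
        ol.subclauses > li {
            counter-increment: subclause;
            position: relative;
        }
        ol.subclauses > li:before {
            content: "(" counter(subclause, lower-alpha) ") ";
            position: absolute;
            left: -2.5em;                /* marker sits in the padding, so wrapped lines stay indented */
        }

    Nested ol elements with their own class (using lower-roman for (i), (ii)) would cover the sub-subclause case.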

    Read the article

  • cURL works but PHP cURL fails to reach the Internet [migrated]

    - by wrk2bike
    Trying to diagnose an issue using PHP to cURL to an Internet location on a RedHat Linux server. cURL is installed and working, and:

        <?php var_dump(curl_version()); ?>

    shows all the correct information in the output. The issue is that I can use PHP to cURL to localhost on the box itself, but not to the Internet (see below). Normally I'd suspect the firewall, but I can cURL from the command line to the Internet without a problem. The box can also update its own software packages, etc. What am I missing? My test is:

        <?php
        function http_head_curl($url, $timeout = 30)
        {
            $ch = curl_init();
            curl_setopt($ch, CURLOPT_URL, $url);
            curl_setopt($ch, CURLOPT_TIMEOUT, $timeout);
            curl_setopt($ch, CURLOPT_HEADER, 1);
            curl_setopt($ch, CURLOPT_NOBODY, 1);
            curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1);
            curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
            $res = curl_exec($ch);
            if ($res === false) {
                throw new RuntimeException("cURL exception: " . curl_errno($ch) . ": " . curl_error($ch));
            }
            return trim($res);
        }

        // Succeeds, displaying headers
        echo(http_head_curl('localhost'));

        // Fails:
        echo(http_head_curl('www.google.com'));
        ?>

    Read the article

  • What is the best Apache logs Analyzer?

    - by Evgeny
    What real-time log analyzer can you suggest for Apache access and error logs? There is a list of web analytics software on Wikipedia, but it would be great to hear opinions from people with experience without having to try all of them. Please don't suggest Google Analytics or any other hosted/JavaScript analytics suite; I'm already using them, but GA is not real-time and it is missing some data that the logs show: for example 404 errors, script errors, the full query string of the referral, IP addresses, visitor paths through the website, etc.

    Read the article

  • Could Ajax + Caching be seen as cloaking?

    - by Angel
    I have a website where we use a technique to speed up loading times, based on a combination of AJAX and caching. Basically, when we have a section in a page whose content is slow to retrieve, we first check whether it is cached. If it is, we serve the content; if it's not, we serve a placeholder and then make an AJAX call from the client to retrieve the content, which is then cached for subsequent requests. As a consequence, sometimes you get the entire page content in the first request, and sometimes you get those placeholders, which get filled immediately with the responses of the AJAX request. You can see an example in the results count by category in the right column of this page: http://www.inzoco.com/crits/2-1-3-28-185-0-28079-0-0/listado-piso-en-alquiler-en-madrid-madrid.aspx I'm worried it could be seen as cloaking by search engines, because if you request a page whose content isn't cached and then ask again for the same page, you get different responses: the first with the placeholders and AJAX requests, and the second with all the content rendered.
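    For reference, a stripped-down sketch of the pattern being described (the endpoint, file names and cache backend are invented for illustration, and the real site is ASP.NET rather than PHP; this is only to make the mechanism concrete):

        <?php
        // Hypothetical cache lookup for one slow page section
        $categoryId = isset($_GET['cat']) ? (int)$_GET['cat'] : 0;
        $key  = 'results-count-' . $categoryId;
        $html = apcu_fetch($key, $hit);              // any cache backend would do

        if ($hit) {
            // Cached: the real content goes out with the page, as in a "full" first response
            echo '<div id="results-count">' . $html . '</div>';
        } else {
            // Not cached: send a placeholder; the client fills it in via AJAX, and the
            // AJAX endpoint stores the computed fragment in the cache for later requests
            echo '<div id="results-count">loading...</div>';
            echo '<script src="/js/fill-results-count.js"></script>';
        }
        ?>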

    Read the article

  • CSS/HTML element formatting template

    - by Elgoog
    I'm looking for an HTML template that contains all the different types of HTML elements, so I can take that template and start the CSS process. That way I can see the full style of the elements without having to create styles for each element as I go. I realize I could do this myself, but I thought someone else may have already done it or might know where to find such a template. Thanks for your help.

    Read the article

  • dns hosting - url forwarding - hiding forwarded url?

    - by jeremycollins
    I have free DNS hosting with the domain registrar, and I'd like the DNS-hosted domain www.example.com to display the contents of www.myotherlongdomain.com. I only have 301/302/iframe forwarding options, but I want to mask the redirected (longdomain) URL. If I use frames, users can view the source and see the (longdomain) URL the contents are coming from. How can I hide it so the browser always displays www.example.com? There is no cloaking/masking option with the registrar. Thanks.
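    For context, the iframe forwarding that registrars offer usually amounts to a page like this being served at www.example.com (a minimal sketch; note the target URL is still visible to anyone who views the page source, which is exactly the limitation described above):

        <!DOCTYPE html>
        <html>
        <head>
            <title>www.example.com</title>
            <style>html, body, iframe { margin: 0; height: 100%; width: 100%; border: 0; }</style>
        </head>
        <body>
            <!-- the address bar keeps showing www.example.com, but the source reveals the real host -->
            <iframe src="http://www.myotherlongdomain.com/"></iframe>
        </body>
        </html>

    Truly hiding the other domain would require serving the content from www.example.com itself (hosting it there, or reverse-proxying it), which plain DNS forwarding cannot do.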

    Read the article

  • Tool to identify Internet Explorer rendering differences with css

    - by Bakaburg
    I develop websites using Chrome on Mac OS as my development environment. Since my audience is pretty specific, I don't feel the need for much backward compatibility with IE8 and earlier. However, to my great dismay, even IE9 looks totally broken... I would like to know if there's a tool on the web that could tell me what probably went wrong with IE - that is, a webapp that parses the rendered CSS and checks which rules are probably broken in IE9.

    Read the article

  • How to keep menu in a single place without using frames

    - by TJ Ellis
    This is probably a duplicate, but I can't find the answer anywhere (maybe I'm searching for the wrong thing?), so I'm going to go ahead and ask. What is the accepted standard practice for creating a menu that is stored in a single file but included on every page across a site? Back in the day one used frames, but this seems to be taboo now. I can get things laid out just the way I want, but copy/pasting across every page is a pain. I have seen PHP-based solutions, but my cheap-o free hosting doesn't support PHP (which is admittedly a pain, but it's a fairly simple website...). Any ideas for doing this that do not require server-side scripting?
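    As a point of reference, one common no-server-side approach is a small piece of JavaScript that loads a shared menu fragment into each page at load time (the file names here are made up, and a menu injected this way may not be visible to search engines, which is worth keeping in mind):

        <!-- on every page -->
        <div id="site-menu"></div>
        <script>
            // fetch menu.html (one shared file) and inject it into the placeholder
            var xhr = new XMLHttpRequest();
            xhr.open('GET', '/menu.html', true);
            xhr.onload = function () {
                if (xhr.status === 200) {
                    document.getElementById('site-menu').innerHTML = xhr.responseText;
                }
            };
            xhr.send();
        </script>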

    Read the article

  • Is it a good idea to add robots "noindex" meta tags to deep, low-content pages, e.g. product model data?

    - by Cognize
    I'm considering adding robots "noindex, follow" tags to the very numerous product data pages that are linked from the product style pages in our online store. For example, each product style has a page with full text content on the product:

    http://www.shop.example/Product/Category/Style/SOME-STYLE-CODE

    Then many pages with technical data for each model code are linked from the product style page:

    http://www.shop.example/Product/Category/Style/SOME-STYLE-CODE-1
    http://www.shop.example/Product/Category/Style/SOME-STYLE-CODE-2
    http://www.shop.example/Product/Category/Style/SOME-STYLE-CODE-3

    It is these technical data pages that I intend to add the noindex tag to, as I imagine this might stop them from cannibalizing keyword authority from more important, content-rich pages on the site. Any advice appreciated.
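    For clarity, the tag in question would sit in the head of each model data page, along the lines of (a sketch, not the shop's actual markup):

        <!-- in the <head> of each model data page -->
        <meta name="robots" content="noindex, follow">

    One caveat worth keeping in mind: the crawler has to be able to fetch the page to see the tag, so these URLs should not also be blocked in robots.txt.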

    Read the article

  • 301 redirect bulk aspx URLs on IIS

    - by tiki16
    We recently relaunched an old ASPX site as a new Drupal site on the same domain, and no 301 redirects were implemented. I have exported a list of 1000 URLs that need to be 301 redirected. Most of the URLs are the results of search queries that were run on the old website, e.g.: http://www.mysite.com/electronics/CommunityDetails.aspx?FirstLetter=%&ID=444 We are running the Drupal site on IIS using a PHP plugin. Is there a way I can wildcard a redirect of all ASPX pages? I know I could do it with .htaccess, but that doesn't apply here. Any suggestions appreciated.
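    On IIS, the usual equivalent of an .htaccess wildcard is a rule in web.config for the URL Rewrite module (a sketch assuming that module is installed; this catch-all sends every .aspx request to the homepage, whereas the 1000 known URLs would more properly go into a rewrite map with their individual targets):

        <configuration>
          <system.webServer>
            <rewrite>
              <rules>
                <rule name="Redirect legacy aspx pages" stopProcessing="true">
                  <match url=".*\.aspx$" ignoreCase="true" />
                  <action type="Redirect" url="/" redirectType="Permanent" />
                </rule>
              </rules>
            </rewrite>
          </system.webServer>
        </configuration>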

    Read the article

  • Favorite Visual Studio 2010 Extensions, Update

    - by Scott Dorman
    With the release of the Visual Studio Pro Power Tools (and many other new extensions having been released), my list of favorite Visual Studio extensions has changed. All of these extensions are available in the Visual Studio Gallery. Here is the list of extensions that I currently have installed and find useful: Bing Start Page CodeCompare Collapse Selection In Solution Explorer Collapse Solution Color Picker Completion Extension Analyzer Find Results Highlighter Find Results Tweak (Available from CodePlex) Format Document HelpViewerKeywordIndex HighlightMultiWord Image Insertion Indentation Matcher Extension ItalicComments MoveToRegionVSX Numbered Bookmarks PowerCommands for Visual Studio 2010 Regular Expressions Margin Search Work Items for TFS 2010 Source Outliner Spell Checker Structure Adornment This also installs the following extensions: BlockTagger BlockTaggerImpl SettingsStore SettingsStoreImpl StyleCop Team Founder Server Power Tools TFS Auto Shelve Visual Studio Color Theme Editor Visual Studio Pro Power Tools VS10x Code Map VS10x Code Marker VS10x Collapse All Projects VS10x Editor View Enhancer VS10x Insert Debug Names VS10x Selection Popup VS10x Super Copy Paste VSCommands 2010 Word Wrap with Auto-Indent   Technorati Tags: Visual Studio,Extensions

    Read the article

  • How to handle canonical url changes like Stack Overflow

    - by lulalala
    Stack Overflow sites all have pretty URLs which include the question title, and the HTML also contains a canonical URL for each page. I just found out that when I change a question title, the URL changes immediately and the canonical URL is updated as well. Does that mean that, as long as the page at the old canonical URL redirects to the new canonical URL, search engines will update their records of the canonical URL too? Is there anything else one can actively do to make the URL change even smoother?
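    To make the moving parts concrete, the pattern being asked about amounts to a canonical tag on the page at its new pretty URL, with the old URL answering with a 301 to the new one (the URLs here are invented for illustration):

        <!-- head of the page, served at the new pretty URL -->
        <link rel="canonical" href="http://example.com/questions/123/new-question-title">

    while the old URL, http://example.com/questions/123/old-question-title, responds with a "301 Moved Permanently" pointing at the new address - the redirect behaviour the question describes.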

    Read the article

  • Can preventing directory listings in WordPress upload folders cause Google ranking drops when they cause 403 errors in Webmaster Tools?

    - by Kelly
    I recently moved to a new host that blocks directory listings for my uploads folders but (hopefully) still allows the files in those folders to be crawled. I now see many 403 errors in Webmaster Tools, one for each folder under the uploads folder. For example, http://www.rewardcharts4kids.com/wp-content/uploads/2013/07/ shows a 403 error, yet I can access a file inside it: http://www.rewardcharts4kids.com/wp-content/uploads/2013/07/lunch-box-notes.jpg. My rankings went down after I moved to this host and I am wondering: could this be the reason, and is this how files/folders are supposed to be set up?

    Read the article

  • Tracking pages with variables in GA

    - by Imran
    I recently updated my site, and it now passes a variable on some links, like so: www.mysite.com/1234/?play=true I've noticed that Google Analytics records www.mysite.com/1234/ and www.mysite.com/1234/?play=true as two different URLs. Is there a way to merge them? They are, after all, just one page, and it makes "Top Content", for example, hard to read because of duplicates. I've read about something called the canonical link tag which may help with this; my blog already has it inserted into the head, but it doesn't make a difference. Any suggestions?

    Read the article

  • Terms and conditions for a commerce site

    - by Mantorok
    I am developing a website for my partner, who is currently a sole trader, presently selling on eBay, but we have opted to create our own site. I've noticed that there are many sites allowing you to purchase baseline T&Cs to be used on websites. I'm tempted to give these a go, but I've heard nothing on whether they are any good or not. I know that in this position it's best to seek legal advice, but the budget is tight so we really can't afford that. Has anyone had experience with these sites? e.g. http://www.netlawman.co.uk/ecomm-it/website-terms-and-conditions.php?gclid=CPL4g8D3q6cCFQoa4Qodhj5UBg. Thanks

    Read the article

  • How should I prepare the design of a web page for a web developer?

    - by jackal
    What techniques, software or practices do you use to prepare a description of a web page for further development? I am doing some research (with little luck) into how to create such descriptions for web developers - what should be included for each page (input widths, font sizes, image placement, etc.). Right now I use a combination of Excel and Word documents, which is inefficient for complex cases. Any other suggestions?

    Read the article

  • Configure PHP and Apache in Windows 7

    - by manxing
    I installed the Apache server successfully on Windows 7, 32-bit. It showed "It works" in the webpage. I also created a file, info.php, containing <?php phpinfo(); ?>. But when I try to open http://localhost/info.php in the browser, all I get is exactly <?php phpinfo(); ?> in plain text. I restarted the Apache server every time I made changes. Can anyone help with this? Many thanks in advance!
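    PHP source appearing verbatim usually means Apache is not handing .php files to PHP at all. For reference, a typical set of httpd.conf lines for the Apache 2.2 PHP module on Windows looks roughly like this (the paths and the PHP 5 module name are assumptions about this particular install):

        # Load the PHP module (path depends on where PHP was unpacked)
        LoadModule php5_module "C:/php/php5apache2_2.dll"

        # Hand .php files to the PHP handler instead of serving them as plain text
        AddHandler application/x-httpd-php .php

        # Tell PHP where to find php.ini
        PHPIniDir "C:/php"

    After adding lines like these, Apache needs a restart - which the poster is already doing after each change.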

    Read the article

  • www.foobar.com works but foobar.com results in a 'Server not found' error

    - by Homunculus Reticulli
    I have just set up a minimal (hopefully secure? - comments welcome) Apache website using the following configuration file:

        <VirtualHost *:80>
            ServerName foobar.com
            ServerAlias www.foobar.com
            ServerAdmin [email protected]

            DocumentRoot /path/to/websites/foobar/web
            DirectoryIndex index.php

            # CustomLog with format nickname
            LogFormat "%h %l %u %t \"%r\" %>s %b" common
            CustomLog "|/usr/bin/cronolog /var/log/apache2/%Y%m.foobar.access.log" common
            LogLevel notice
            ErrorLog "|/usr/bin/cronolog /var/log/apache2/%Y%m.foobar.errors.log"

            <Directory />
                AllowOverride None
                Order Deny,Allow
                Deny from all
            </Directory>

            <Directory /path/to/websites/>
                Options -Indexes FollowSymLinks MultiViews
                AllowOverride None
                Order allow,deny
                allow from all
            </Directory>
        </VirtualHost>

    I am able to access the website by using www.foobar.com, but when I type foobar.com I get the error 'Server not found' - why is this? My second question concerns the security implications of this directive in the configuration above:

        <Directory /path/to/websites/>
            Options -Indexes FollowSymLinks MultiViews
            AllowOverride None
            Order allow,deny
            allow from all
        </Directory>

    What exactly is it doing, and is it necessary? From my (admittedly limited) understanding of Apache configuration files, this means that anyone will be able to access (write to?) the /path/to/websites/ folder. Is my understanding correct? - and if yes, how is this not a security risk?

    Read the article

  • SEO: many stackoverflow users' pages have got no Google PR and they are not indexed, why?

    - by Marco Demaio
    If you go to my user page on Stack Overflow and check it with the Google toolbar, you can see it has no PR at all (this happens for almost any user page, even for people with much higher reputation; the only exceptions seem to be the users on page 1 and some other users who do have PR). My user page's PageRank is not only zero, it is not calculated at all: when PR is 0 or less than 1 but calculated, the Google bar shows white, but when the PR is not even calculated, as on my user page, the bar shows grey.

    I furthermore discovered that my user page is NOT EVEN INDEXED on Google. A simple test is searching Google for the exact page URL, "http://stackoverflow.com/users/260080/marco-demaio", and you will see no result. How can this be? This is really weird to me for the following reason: if you search Google for "Marco Demaio" on the Stack Overflow site only (you can do this by searching "site:stackoverflow.com Marco Demaio"), the results show hundreds of question/answer pages where I was 'tagged'. Let's check one of these, the first one that appears now (it shows one of the questions I asked). We can be sure this page is indexed in Google because it comes up in a search; moreover its PR is calculated - probably nearly zero, but some PR still flows there, and the PR bar is white, not grey.

    That page links to my own user page. I checked its source code and the links are not hidden or set with rel="nofollow", and I can't see any meta tag excluding the links on the page from being followed. So what's happening? Why does Google not see my user page at all? Did Stack Overflow do something to achieve this, and if so, what? Any explanation is really appreciated (as always).

    P.S. Obviously I also checked the code of my user page, but I could not find meta tags excluding it from Google search.

    P.S. 2: In a desperate adventure I also checked Stack Overflow's robots.txt, but it does not seem to exclude user pages.

    UPDATE 1: Following up on some answers, I did some more research. Setting the PR problem aside for a while (since PR is not science) and looking only at the problem of the user page NOT BEING INDEXED: indexing does not seem to depend on user reputation - this user, for instance, now has 200 points less reputation than me and his page is indexed (while mine is not). It does not even seem to be connected to how many months you have been on Stack Overflow: this user (with almost my reputation) has been there for only 3 months and his page is indexed, while mine is not and I have been a user for 7 months. It's bizarre!

    UPDATE February 2011: As of today the page has been indexed by Google; at least, when you search for "site:stackoverflow.com Marco Demaio" it is the first result. The amazing thing is that it still has NO PageRank at all: the Google toolbar states loud and clear "No PageRank information available". It's odd!

    Read the article

  • Apache mod_rewrite for multiple domains to SSL

    - by Aaron Vegh
    Hi there, I'm running a web service that will allow people to create their own "instances" of my application, running under their own domain. These people will create an A record to forward a subdomain of their main domain to my server. The problem is that my server runs everything under SSL. So in my configuration for port 80, I have the following:

        <VirtualHost *:80>
            ServerName mydomain.com
            ServerAlias www.mydomain.com
            RewriteEngine On
            RewriteCond %{HTTPS} !=on
            RewriteRule /(.*) https://mydomain.com/$1 [R=301]
        </VirtualHost>

    This has worked well to forward all requests from the http: to the https: domain. But as I said, I now need to let any domain automatically forward to the secure version of itself. Is there a rewrite rule that will let me take the incoming domain and rewrite it to the https version of the same? So that the following matches would occur:

        http://some.otherdomain.com -> https://some.otherdomain.com
        http://evenanotherdomain.com -> https://evenanotherdomain.com

    Thanks for your help! Apache mod_rewrite makes my brain hurt. Aaron.
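    For what it's worth, a sketch of a host-preserving version of the same rule, using %{HTTP_HOST} instead of a hard-coded domain (it assumes the SSL side is set up to answer for those other hostnames, e.g. via SNI or a suitable certificate, otherwise browsers will warn):

        <VirtualHost *:80>
            ServerName mydomain.com
            ServerAlias *
            RewriteEngine On
            RewriteCond %{HTTPS} !=on
            # %{HTTP_HOST} carries whatever hostname the visitor actually used
            RewriteRule ^/?(.*) https://%{HTTP_HOST}/$1 [R=301,L]
        </VirtualHost>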

    Read the article

  • Exit link tracking with timestamped logs on 3rd party content

    - by dandv
    I want to track clicks on exit links, that are placed in 3rd party content, for example on Twitter. I also need the timestamps of the clicks. Google Analytics can't be embedded in 3rd party content. Another solution is to use a URL shortener like bit.ly. However, bit.ly or goo.gl don't log the time of the click with any better granularity than a full day. su.pr shows the time for the past day in its analytics graph. The analytics download only includes the day, not the time. cli.gs was touted as having the most detailed analytics, yet it doesn't show the time either, and forces the user through a preview page. Any ideas?
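    One self-hosted way to get per-click timestamps is to route the exit links through a tiny redirect script on a server you control and log there; a minimal PHP sketch (the file name, log location and destination list are all invented for illustration):

        <?php
        // go.php?to=docs : the short key picks a whitelisted destination
        $targets = array(
            'docs'  => 'http://example.com/docs',    // hypothetical destinations, whitelisted
            'promo' => 'http://example.com/promo',   // so the script is not an open redirect
        );
        $key    = isset($_GET['to']) ? $_GET['to'] : '';
        $target = isset($targets[$key]) ? $targets[$key] : 'http://example.com/';

        // timestamp, link key, referrer and IP, tab-separated, one line per click
        $line = date('c') . "\t" . $key . "\t"
              . (isset($_SERVER['HTTP_REFERER']) ? $_SERVER['HTTP_REFERER'] : '-') . "\t"
              . $_SERVER['REMOTE_ADDR'] . "\n";
        file_put_contents(__DIR__ . '/clicks.log', $line, FILE_APPEND | LOCK_EX);

        header('Location: ' . $target, true, 302);
        exit;

    The links placed in the 3rd-party content then point at go.php rather than at the destination directly, so every click leaves a timestamped line in clicks.log.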

    Read the article
