Search Results

  • Joomla sometimes messes up URLs, probably cache involved

    - by Bakaburg
    For a while now I've been having this problem and I really can't get the hang of it. Every once in a while my Joomla site messes up link URLs, so that something like this: http://www.sism.org/index.php?option=com_comprofiler&task=userslist&listid=4&Itemid=123 becomes something like this: http://www.sism.org/index.php/component/k2/administrator/components/com_dump/assets/css/images/stories/inrilievo/sism/htm/index.php?option=com_comprofiler&task=userslist&listid=4&Itemid=123 The new page has the right content, but the CSS and other linked resources are missing. Usually I solve the problem by deleting all the cache and turning it off and on again. Of course this is pretty annoying, especially for my association. Does anyone have any clue about this? Judging from the URLs, the components involved seem to be K2 and Jdump. Thanks

  • Why doesn't Wikipedia appear as a referral in Google Analytics' Traffic sources?

    - by Rober
    One of my clients has a website and got a non-spammy backlink in a Wikipedia article. When I test it for SEO purposes with Google Analytics (from different IPs), apparently there is no referral information. On the Real-Time view my test visit is visible, but the referrals subview shows "There is no data for this view", and these visits appear as (direct) / (none) on the Traffic sources view. Wikipedia is not hiding its link origin in any way, since it shows up in the server visit logs. Is Google ignoring Wikipedia as a referral? Am I missing anything else? Update: Now it works, several days after the link went live. Maybe something detects how long the link has been there, so it doesn't count right from the beginning, as a security measure? Many visits are still not recorded.

  • Can I include a robots meta tag outside of the head in HTML snippets intended to be SSIed?

    - by Dan
    I have a number of files in my site which are not intended for independent viewing, but rather to be AJAXed into content within the site. They obviously don't meet HTML standards (no body, head, etc.) as independent entities. I would like to prevent search engines from indexing these pages, but do not have access to /robots.txt (which would be much more ideal). My question is, could I include the following at the top of these partial HTML files and get the desired results? <meta name="robots" content="noindex, noarchive"> I guess there are two parts to this question. Will this cause any rendering issues in any browsers? Will search engines (at least Google & Bing) interpret this as intended?
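
    One option, assuming these partials are served through PHP rather than as static .html files (that part is an assumption about your setup), is to skip the in-body meta tag entirely and send the equivalent HTTP header instead, which search engines honour regardless of where the markup ends up. A minimal sketch:

        <?php
        // fragment.php -- hypothetical partial that gets AJAXed into pages.
        // X-Robots-Tag is the HTTP-header equivalent of the robots meta tag,
        // so nothing has to be placed outside of a <head> element.
        header('X-Robots-Tag: noindex, noarchive');
        header('Content-Type: text/html; charset=utf-8');
        ?>
        <div class="snippet">
            <!-- partial markup as before -->
        </div>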

  • AdSense sent an email saying my account has been approved when it already was approved

    - by moomoochoo
    My account has been approved and has been running adverts for quite some time now. However, today I got a message (it seems legitimate) from Google AdSense saying: "Congratulations, your AdSense account has been approved to show AdSense ads on your own website. Within a few hours, you will begin to see live ads." Should I be concerned? They say that they review accounts to check for compliance; could this be some odd way of saying they rechecked my sites and found them compliant?

  • Subdomain redirect to WWW

    - by manix
    I have the domain example.com and the subdomain test.example.com running on an Apache server. For some reason, when I try to visit test.example.com it is redirected to www.test.example.com, and consequently a Server not found error is displayed in the browser. Both .htaccess files (root and subdomain folder) are empty. Additional facts: I have another subdomain, xyz.example.com, pointed at the public_html/xyz directory with some content inside (an index.html with a "hello world" message), and it works fine as long as I use xyz.example.com instead of www.xyz.example.com. So, can you point me in the right direction to fix this? I have a VPS and I am able to change any file if required. Below you can find my virtual host configuration:

        <VirtualHost xx.xxx.xxx:80>
            ServerName test.example.com
            ServerAlias www.test.example.com
            DocumentRoot /home/example/public_html/test
            ServerAdmin [email protected]
            UseCanonicalName Off
            CustomLog /usr/local/apache/domlogs/test.example.com combined
            CustomLog /usr/local/apache/domlogs/test.example.com-bytes_log "%{%s}t %I .\n%{%s}t %O ."
            ScriptAlias /cgi-bin/ /home/example/public_html/test/cgi-bin/
            # To customize this VirtualHost use an include file at the following location
            # Include "/usr/local/apache/conf/userdata/std/2/example/test.example.com/*.conf"
        </VirtualHost>

  • Is there a way to hide text from descriptions in Google search results

    - by Linda H
    The first line of text on all of our client's product pages is "Download hi-res images", which of course isn't what we'd want in the description when people search for their products. Is there any way to hide this text/link so that Google and the other search engines just ignore it and move on to the descriptive text below? I suppose we could use a meta description, but the client isn't very good with computers, and the site is so small it seems silly.
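
    If the product pages go through PHP templates, a per-product meta description is only a couple of lines. The $product array and its 'summary' field below are hypothetical stand-ins for however the page already loads its data; this is just a sketch:

        <?php
        // Hypothetical product template. A meta description replaces the snippet
        // Google would otherwise build from the first visible text on the page
        // ("Download hi-res images").
        $description = substr(strip_tags($product['summary']), 0, 155);
        ?>
        <meta name="description" content="<?php echo htmlspecialchars($description, ENT_QUOTES, 'UTF-8'); ?>">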

  • Effect of WordPress membership plugins on speed -- currently trying s2member [migrated]

    - by Richard
    I'm taking a look at s2member -- I have it running, and my site is very slow -- it's taking on average about 9 or 10 seconds to load. This is the site: http://richardclunan.net I want to figure out whether the s2member plugin is causing it to be slow, and whether there are other, faster membership plugins. 3 questions: Are there particular settings or things specific to s2member that I should take care of to ensure s2member doesn't make my site slow? If I deactivate the plugin to test the speed of the site with the plugin deactivated, will that mean I'll have to respecify the s2member settings when I reactivate it? After it's reactivated, will members' accounts work OK? Does anybody have observations on s2member or other WordPress membership plugins and their effect on site speed?
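
    For the "is the plugin the culprit" part, one low-tech sketch is to print WordPress's own timing helpers in the theme footer (e.g. footer.php) and compare the numbers with s2member active and deactivated; the functions below are standard WordPress core helpers, the admin-only guard is just a precaution:

        <?php
        // Quick-and-dirty benchmark for a theme footer: prints page generation
        // time and database query count as an HTML comment, visible to admins only.
        if (current_user_can('manage_options')) {
            echo '<!-- ' . get_num_queries() . ' queries in ' . timer_stop(0, 3) . ' seconds -->';
        }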

  • Average screen ratio

    - by sam
    I'm building a portfolio website that uses a full-screen background image slideshow, with the images cropped to fit by a JS plugin. To keep cropping to a minimum, what's the best aspect ratio to make the images? I.e., I know 13" MacBooks are around 13:7 (when taking into account about 100px for the browser bar), but does that scale up on 15", 17" and 24" displays? I know there are charts showing the most common dimensions, but they just show a range of sizes grouped into categories rather than actual dimensions.

  • Are icon fonts bad for SEO?

    - by user359650
    Instead of using <img> tags for your icons, you can use icon fonts on <span> tags (which offer some advantages, such as not having to create a sprite and being able to scale icons up/down without degrading quality). However, by using an icon font you give up the <img> alt attribute (an attribute that can help you with SEO). There is a way to add text to the <span> and hide it, but I wonder whether this is recognized / penalized by Google (as it seems to go against the quality guidelines). Are icon fonts bad for SEO (i.e. by using icon fonts you give up the alt attribute)? Would inserting text in the icon font tag and hiding it with CSS (text-indent: -9999px) be recognized / penalized by Google?
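
    For what it's worth, the pattern usually considered safer is text that stays available to crawlers and screen readers but is clipped with CSS, rather than text-indent: -9999px. A small PHP helper sketch (the function name and class names are mine, not from any particular framework):

        <?php
        // Prints an icon-font span plus a text label. The .visually-hidden CSS
        // class is assumed to clip the label off-screen (position: absolute;
        // clip: rect(0,0,0,0)) so it stays readable by crawlers and screen readers.
        function icon_with_label($iconClass, $label)
        {
            return '<span class="' . htmlspecialchars($iconClass) . '" aria-hidden="true"></span>'
                 . '<span class="visually-hidden">' . htmlspecialchars($label) . '</span>';
        }

        echo icon_with_label('icon-download', 'Download');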

  • Domain from A and hosting from B

    - by Zero
    I bought a domain from one company and hosting from another. On the hosting company's website I found the DNS addresses and applied them on the domain company's website (changed the DNS). I did this yesterday, so it should be working by today, but instead "Unable to resolve the server's DNS address" appears. In the DirectAdmin control panel (DNS control) I have the following (my hosting company's settings): http://pastebin.com/MGbQ02hr Note: IP and domain hidden! Any ideas what's wrong?
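
    One quick sanity check, sketched here on the assumption that PHP is available somewhere (the CLI on the VPS, for instance): ask the resolver directly what it currently sees for the domain and compare the NS records with the ones entered at the registrar. example.com below stands in for the hidden domain.

        <?php
        // Print the NS and A records the resolver currently returns for the domain.
        // If the NS records are not the hosting company's, the change has not
        // propagated yet (or was entered incorrectly at the registrar).
        $domain = 'example.com';

        foreach (dns_get_record($domain, DNS_NS) as $record) {
            echo "NS: {$record['target']}\n";
        }
        foreach (dns_get_record($domain, DNS_A) as $record) {
            echo "A:  {$record['ip']}\n";
        }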

  • Getting the masked URL values in Mediawiki

    - by Kalai
    I have successfully masked the URLs in MediaWiki by using the following in the .htaccess and LocalSettings.php files:

        .htaccess:

        Options +FollowSymLinks
        RewriteEngine on
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule ^(.*)/(.*)$ /mediawiki/index.php?title=$1&actions=$2 [L]

        LocalSettings.php:

        $wgScriptPath = "/lib/mediawiki";
        $wgArticlePath = "/lib/mediawiki/$1/$2";

    It is working fine with the required URL. But my problem is that I want to treat the second parameter as a query string for my pages, and I cannot get the second parameter in my file. I tried the $wgRequest object, but it only gives the first parameter, the title. I tried $_REQUEST as well; it sometimes gives the value of $_REQUEST['actions'], but many times it does not. I can't understand what the problem is.
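
    As a workaround sketch, the second segment can be pulled straight out of the original request URI instead of relying on the rewritten actions parameter. This assumes the public URLs really do end in /Title/Action as the rewrite rule above implies:

        <?php
        // Recover the last two path segments from the URL the client actually
        // requested, independent of what mod_rewrite passed in the query string.
        $path     = parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH);
        $segments = array_values(array_filter(explode('/', $path), 'strlen'));
        $count    = count($segments);

        $action = $count >= 1 ? urldecode($segments[$count - 1]) : '';
        $title  = $count >= 2 ? urldecode($segments[$count - 2]) : '';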

  • Does Fetch as Googlebot still support their ajax-crawling proposal?

    - by Gunchars
    I spent half a day implementing the server side html generation for modal pages based on their proposal (link), but it seems like the Fetch as Googlebot functionality in Webmaster tools completely ignores the URL fragment. I've verified that the _escaped_fragment_ functionality is working on my server (example), but when I submit a URL like /#!/recipes, the Googlebot just fetches /. There aren't any recent confirmations that it's working and, honestly, it wouldn't surprise me if they just silently dropped the functionality without even editing the docs.
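
    For reference, a stripped-down sketch of the server side of that scheme (the snapshot directory and file naming below are made up for illustration): a crawler that honours the proposal requests /?_escaped_fragment_=/recipes in place of /#!/recipes, and the server answers with a pre-rendered HTML snapshot.

        <?php
        // Front controller sketch: serve a static snapshot when the crawler sends
        // the escaped fragment, otherwise fall through to the normal JS application.
        if (isset($_GET['_escaped_fragment_'])) {
            $fragment = $_GET['_escaped_fragment_'];               // e.g. "/recipes"
            $snapshot = __DIR__ . '/snapshots/' . md5($fragment) . '.html';

            if (is_file($snapshot)) {
                header('Content-Type: text/html; charset=utf-8');
                readfile($snapshot);
                exit;
            }

            header('HTTP/1.1 404 Not Found');
            exit;
        }

        // ... normal single-page application shell continues here ...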

  • Will this sitemap get me de-indexed from Google?

    - by heavy rocker dude
    Will this sitemap get me de-indexed from Google? My new sitemap just got spidered by Google for some reason. It is located at http://quantlabs.net/private/sitemap.xml; is it in danger of getting me de-indexed from Google's index? Does it look like spam, even though it is not meant to be? I am trying to figure out Google's threshold before they deem a sitemap spammy. The sitemap contains automated postings that differ only in the stock symbol provided, and there are quite a few of them posted within a small amount of time.

  • Looking for a shuffle radio mp3 player

    - by ofir
    I'm looking for a shuffle radio MP3 player that I can embed in my site, which shows the album and title plus a link to buy the track and album in a shop. Like this one: http://phpfoxmods.net/clients/productsview.php?id=7 -- but that one is integrated with an MP3 community. The MP3 broadcast should be secured. Free or paid solution? Is it better integrated with a shop or with WP? A simple shop like this: http://idevspot.com/demos/idev-musicshop/index.php

  • Sitemap.xml generator

    - by miller55
    I need a sitemap generator that supports unlimited pages. Also, is there a generator that lets me exclude some pages from its crawl? For example, I have pages like review pages, which are generated automatically when someone adds a review. They all look like mysite.com/review_1.html, mysite.com/review_2.html and so on, so I don't want the sitemap generator to include those pages. Thanks in advance!
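
    If nothing off the shelf fits, rolling a minimal one is not much work. A rough PHP sketch, assuming the pages are flat .html files in the document root; the paths below and the review_N.html pattern are taken from the example above and would need adjusting:

        <?php
        // Minimal sitemap generator: list top-level .html pages, skip the
        // auto-generated review pages, emit one <url> entry per remaining page.
        $docRoot = '/var/www/mysite';              // assumption
        $baseUrl = 'http://mysite.com';            // assumption

        echo '<?xml version="1.0" encoding="UTF-8"?>' . "\n";
        echo '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">' . "\n";

        foreach (glob($docRoot . '/*.html') as $file) {
            $name = basename($file);
            if (preg_match('/^review_\d+\.html$/', $name)) {
                continue;                          // keep review pages out of the sitemap
            }
            echo "  <url>\n";
            echo "    <loc>{$baseUrl}/{$name}</loc>\n";
            echo "    <lastmod>" . date('Y-m-d', filemtime($file)) . "</lastmod>\n";
            echo "  </url>\n";
        }

        echo "</urlset>\n";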

  • How do I remove these errors from my blog so as to get AdSense approved?

    - by Serenity
    This is the question I asked on the SO site earlier, but I didn't get satisfactory replies, so I'm hoping to find a solution here: http://stackoverflow.com/questions/12136796/how-can-i-detect-and-correct-these-errors-on-my-blog/12136829#comment16235061_12136829 In Webmaster Tools, apart from the errors in the question linked above, it is showing a sitemap error too, as in the screenshot below. Need guidance please... thanks :) Edit 1 Edit 2: I had 2 SEO plugins on my blog, and I would enter a meta description for each of my articles in both plugins, All in One SEO and Yoast's "WordPress SEO". The other day I removed all the articles' meta descriptions from "All in One SEO", but Webmaster Tools is STILL showing duplicate meta tags and descriptions. Why?

  • 301 re-direct all external links to new domain

    - by Dean Legg
    I have changed the main domain to a sub-domain and would like to 301-redirect all external links to the new sub-domain. I have read a few articles but am having no luck editing the .htaccess, as it might be interfering with the rules already in there. Old: www.example.co.uk New: https://secure.example.co.uk The current rules are quite handy because they seem to have sorted out the structure for all internal links. They have even updated the file path for images (or this could just be WordPress, as the URL was updated under general settings). This is the current .htaccess:

        <files wp-config.php>
        order allow,deny
        deny from all
        </files>

        # BEGIN WordPress
        <IfModule mod_rewrite.c>
        RewriteEngine On
        RewriteBase /
        RewriteRule ^index\.php$ - [L]
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule . /index.php [L]
        </IfModule>
        # END WordPress
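
    If editing the .htaccess keeps clashing with the WordPress rules, one fallback sketch is to do the 301 in PHP before WordPress loads, e.g. near the top of wp-config.php. The host names below are the ones from the question; this is an alternative to, not a substitute for, a proper mod_rewrite rule:

        <?php
        // Redirect any request that still arrives on the old host to the new
        // sub-domain, preserving the requested path and query string.
        $oldHost = 'www.example.co.uk';
        $newHost = 'secure.example.co.uk';

        if (isset($_SERVER['HTTP_HOST']) && strcasecmp($_SERVER['HTTP_HOST'], $oldHost) === 0) {
            header('Location: https://' . $newHost . $_SERVER['REQUEST_URI'], true, 301);
            exit;
        }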

  • Which CMS for photo-blog website?

    - by Gacek
    I need to add a photo-blog to a site that I'm currently working on. It is a very simple site, so the blog doesn't have to be very sophisticated. What I need is: a CMS that allows me to create simple blog-like news posts with one (or more) images at the beginning and some description/comment below. Preferably, I would like to create something that works like these two sites: http://www.photoblog.com/dreamie or http://www.photoblog.pl/mending/ It must be customizable; I want to integrate its look as much as possible with the current page: http://saviorforest.tk Preferably, it should provide some mechanism for uploading and storing images on the server. I thought about WordPress, but it seems a little too complicated for such a simple task. Do you know any simple and easy-to-use CMS that would work here?

  • Leveraging a hosted web font service from a local development server?

    - by Tom Auger
    There are a number of popular web font services on the market today who "host" the fonts and serve them to your web page via javascript or CSS pointing to remote locations. For example http://webfonts.fonts.com or http://typekit.com However, there seems to be an issue when you're developing on a local testing server - the remote font services don't validate the font and return 403 access denied errors and the like. What workarounds are there for using remote services such as a hosted font service, on a local development server?

  • Problem with missing JSON functions on PHP 5.2.6 / Plesk 8.4

    - by Drachenviech
    I have a vserver running openSUSE 10.3, Apache 2 and Plesk 8.4. I can update/upgrade neither: it is apparently not recommended to upgrade openSUSE 10.3 (and an update to the EOL 10.4 does not seem to make much sense), and Plesk fails to update no matter what version I try (it even fails to upgrade to 8.4.1). Still, I can live with that somehow, primarily because I don't have the time to do a fresh remote install on the vserver. What really is a problem is that although the installed PHP is 5.2.6, it has no zip library and no JSON functions. The first is probably because PHP was not compiled with --enable-zip. The second is a big mystery, though. As I understand it, JSON always comes with PHP unless it is compiled with the --disable-json configure option, which is not the case here. Yet the json extension module is just not there; I even tried to enable it with extension=json.so, with no luck either. The configure options of my PHP are (as shipped with Plesk 8.4):

        '../configure' '--prefix=/usr' '--datadir=/usr/share/php5' '--mandir=/usr/share/man'
        '--bindir=/usr/bin' '--with-libdir=lib' '--includedir=/usr/include'
        '--sysconfdir=/etc/php5/apache2' '--with-config-file-path=/etc/php5/apache2'
        '--with-config-file-scan-dir=/etc/php5/conf.d' '--enable-libxml' '--enable-session'
        '--with-mm' '--with-pcre-regex=/usr' '--enable-xml' '--enable-simplexml' '--enable-spl'
        '--enable-filter' '--disable-debug' '--enable-inline-optimization' '--disable-rpath'
        '--disable-static' '--enable-shared' '--program-suffix=5' '--with-pic' '--with-gnu-ld'
        '--with-system-tzdata=/usr/share/zoneinfo' '--with-apxs2=/usr/sbin/apxs2' '--disable-all'
        '--disable-cli'

    As I understand it, PECL is not an option with 5.2.6. Or am I mistaken? Even if I am not, the openSUSE repository only goes as far as PHP 5.2.4. The openSUSE install even came without zypper, which I had to install manually. So is there a way to get the zip library and JSON running in PHP 5.2.6 without having to recompile the binary?
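
    One workaround sketch for the JSON half, assuming the PEAR Services_JSON class can be dropped onto the box (it is plain PHP, no compilation needed): define userland json_encode()/json_decode() only when the extension is missing, so existing code keeps working unchanged.

        <?php
        // Userland fallback for the missing json extension.
        // Requires the PEAR Services_JSON package (a pure-PHP implementation).
        if (!function_exists('json_encode')) {
            require_once 'Services/JSON.php';

            function json_encode($value)
            {
                $encoder = new Services_JSON();
                return $encoder->encode($value);
            }

            function json_decode($value, $assoc = false)
            {
                $decoder = new Services_JSON($assoc ? SERVICES_JSON_LOOSE_TYPE : 0);
                return $decoder->decode($value);
            }
        }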

  • SEO for images: can I use a different (cookieless) domain?

    - by Oliver
    Hello, We want to increase the SEO value of some of our important images, and we want to start serving them from a different, i.e. cookieless, domain. We want to go from http://www.example.com/images/1234.jpg to http://www.example.com/germany/bavaria/landscape.jpg, which can easily be done via URL rewriting. On the other hand, we would also like to serve the image from a completely different domain, let's say http://www.examplestatic.com/germany/bavaria/landscape.jpg, to save the overhead of sending the www.example.com cookie with every image request. Somehow I feel that this is not a good idea, because it moves the image away from the content by putting it on a different domain. Can anyone shed some light on this problem? Naturally, I would just use a different subdomain, e.g. img.example.com, but we already use subdomains for languages, and our cookies are valid for all subdomains of example.com, so this won't help. I'd really appreciate any hints. Cheers,

  • Should I prevent search engines indexing tag/category pages?

    - by Macha
    On my site I currently have no special rules for search engines. It is a blog, statically generated by a Python program. When I search for some of my articles on Google, there is usually a tag or category page included in the results; sometimes it even ranks ahead of the article itself. Obviously, since these links aren't always going to have the article on them, they aren't the results I want people to click on. So I'm thinking of setting noindex on these pages. Is there any possible downside to doing so? Is this possible to do via robots.txt, or do I have to add it to all the relevant templates? All I can find for robots.txt are ways to stop search engines from crawling those pages, which isn't what I want: while I don't want them indexed, crawling them is still the only surefire way for a search engine to find all my blog posts.

  • PHP-FPM stops responding and dies [migrated]

    - by user12361
    I'm running Drupal 6 with Nginx 1.5.1 and PHP-FPM (PHP 5.3.26) on a 1GB single-core VPS with 3GB of swap space on SSD storage. I just switched from shared hosting to this unmanaged VPS because my site was getting too heavy, so I'm still learning the ropes. I have moderately high traffic; I don't monitor it closely, but Google AdSense usually records close to 30K page views/day. I usually have 50 to 80 authenticated users logged in and a few hundred more anonymous users hitting the Boost static HTML cache at any given moment.

    The problem I'm having is that PHP-FPM frequently stops responding, resulting in Nginx 502 or 504 errors. I swear I have read every page on the internet about this issue, which seems fairly common, and I've tried endless combinations of configurations, and I can't find a good solution. After restarting Nginx and PHP-FPM, the site runs really fast for a while, and then without warning it simply stops responding. I get a white screen while the browser waits on the server, and after about 30 seconds to a minute it throws an Nginx 502 or 504 error. Sometimes it runs well for 2 minutes, sometimes 5 minutes, sometimes 5 hours, but it always ends up hanging. When I find the server in this state, there is still plenty of free memory (500MB or more) and no major CPU usage, the control and worker PHP-FPM processes are still present, and the server is still pingable and usable via SSH. A reload of PHP-FPM via the init script revives it again. The hangups don't seem to correspond to the amount of traffic, because I observed this behavior consistently when I was testing this configuration on a development VPS with no traffic at all.

    I've been constantly tweaking the settings, but I can't definitively eliminate the problem. I set Nginx workers to just 1. In the PHP-FPM config I have tried all three of the process managers. "dynamic" is definitely the least reliable, consistently hanging up after only a few minutes. "static" has also been unreliable and unpredictable. The least buggy has been "ondemand", but even that is failing me, sometimes after as much as 12 to 24 hours. And I can't leave the server unattended, because PHP-FPM dies and never comes back on its own. I tried adjusting the pm.max_children value from as low as 3 to as high as 50; it doesn't make a lot of difference, but I currently have it at 10. Same thing for the spare-servers values. I have also set pm.max_requests anywhere from 30 to unlimited, and it doesn't seem to make a difference.

    According to the logs, the PHP-FPM processes are not exiting with SIGSEGV or SIGBUS, but rather with SIGTERM. I get a lot of lines like:

        WARNING: [pool www] child 3739, script '/var/www/drupal6/index.php' (request: "GET /index.php") execution timed out (38.739494 sec), terminating

    and:

        WARNING: [pool www] child 3738 exited on signal 15 (SIGTERM) after 50.004380 seconds from start

    I actually found several articles that recommend doing a graceful reload of PHP-FPM via cron every few minutes or hours to circumvent this issue. So that's what I did: "/etc/init.d/php-fpm reload" every 5 minutes. So far, it's keeping the lights on, but it feels like a dreadful hack. Is PHP-FPM really that unreliable? Is there anything else I can do? Thanks a lot!
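
    A slightly less blunt version of that cron hack is a watchdog that only reloads PHP-FPM when the site actually stops answering. A sketch, run via the PHP CLI from cron every few minutes; the URL, timeout, log path and the "/etc/init.d/php-fpm reload" command (taken from the question) are all assumptions about this particular box:

        <?php
        // Cron watchdog: fetch the front page with a hard timeout and reload
        // PHP-FPM only if the request fails or hangs past the limit.
        $url     = 'http://127.0.0.1/';
        $timeout = 20; // seconds before we declare the site hung

        $context = stream_context_create(array('http' => array('timeout' => $timeout)));
        $body    = @file_get_contents($url, false, $context);

        if ($body === false) {
            exec('/etc/init.d/php-fpm reload');
            error_log(date('c') . " php-fpm reloaded by watchdog\n", 3, '/var/log/php-fpm-watchdog.log');
        }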

  • I've changed my URL schema. How do I tell Google to index the new schema and forget the old one?

    - by growse
    I had a site where the URLs were constructed like this:

        /index.php/Topic
        /index.php/AnotherTopic

    These were indexed in Google, and search results were returned that pointed to them. However, I've recently replatformed the site and reconfigured it, so the above URLs are now:

        /index.php?title=Topic
        /index.php?title=AnotherTopic

    The original URLs are returning 404s. The site links to the correct URL schema internally, but Google is retaining the original schema in its search results. I've updated and resubmitted the sitemap, which only contains the new schema. Also, Google's Webmaster Tools is going slightly bananas at the fact that there's now a spike in 404 errors in its crawl results. What would be the best approach to get Google to 'forget' about the old schema and instead index the new schema? Should I try blocking /index.php/ in robots.txt? Should I be returning 301 codes instead of 404 for the original URLs?
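
    Returning 301s is generally the cleaner signal than either 404s or robots.txt blocking, and it can be sketched at the top of index.php itself. This assumes requests in the old /index.php/Topic form still reach the script (they may need AcceptPathInfo On in Apache, which is an assumption about the setup):

        <?php
        // If the request still uses the old path-style schema, permanently redirect
        // it to the new query-string schema so Google can transfer the old listings.
        if (!empty($_SERVER['PATH_INFO'])) {
            $topic = ltrim($_SERVER['PATH_INFO'], '/');   // "Topic" from /index.php/Topic
            header('Location: /index.php?title=' . rawurlencode($topic), true, 301);
            exit;
        }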

  • How can I determine the trending pages on my site?

    - by Dogweather
    I'm looking to see what the "hot" pages are on one of my sites. For various timeframes, I want to see what the top-50 pages are. I'm going to create a data feed with this info, which will be input to another app. I have Apache logs and complete control of the machine, so I can install whatever I want. I'm mostly wondering if there's something out there already that I can use, or, if I have to implement it myself, what good algorithms or strategies there might be. Thanks.
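
    For a homegrown version, a rough PHP sketch that reads a combined-format Apache access log and counts hits per path over the last 24 hours; the log location and the time window are assumptions and would need adjusting per site:

        <?php
        // Count hits per URL path in a recent window and print the top 50,
        // one "count <TAB> path" line per page, ready to feed to another app.
        $logFile = '/var/log/apache2/access.log';
        $cutoff  = new DateTime('-24 hours');
        $counts  = array();

        foreach (file($logFile, FILE_IGNORE_NEW_LINES) as $line) {
            // e.g. 1.2.3.4 - - [10/Oct/2013:13:55:36 -0700] "GET /some/page HTTP/1.1" 200 ...
            if (!preg_match('#\[([^\]]+)\] "(?:GET|POST) ([^ "?]+)#', $line, $m)) {
                continue;
            }
            $when = DateTime::createFromFormat('d/M/Y:H:i:s O', $m[1]);
            if ($when === false || $when < $cutoff) {
                continue;
            }
            $counts[$m[2]] = isset($counts[$m[2]]) ? $counts[$m[2]] + 1 : 1;
        }

        arsort($counts);
        foreach (array_slice($counts, 0, 50, true) as $path => $hits) {
            echo "$hits\t$path\n";
        }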
