Search Results

Search found 9724 results on 389 pages for 'pro zeck'.


  • 100% APC Fragmentation - Cacherouter & Pressflow install

    - by granttoth
    My APC cache has 100% fragmentation. I'm not quite sure I understand what is going on here. For testing I jacked the available memory up to 512. After a day the total available free memory shows 73%, but I still have 100% fragmentation. Would you gurus please look at my settings and offer your advice? I have read suggestions to disable apc.stat when possible, but when I do, the site crashes. I am using the Pressflow build of Drupal 6 with the cacherouter module installed. Edit: (added screenshot) http://i.imgur.com/DqZEX.png
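
    For anyone trying to reproduce the fragmentation figure from the screenshot: the stock apc.php dashboard derives it from apc_sma_info(), so a short script can print the same number outside the dashboard. A minimal sketch follows, assuming the APC extension is loaded; the array keys ('block_lists', 'size') are the ones I have seen apc_sma_info() return and may differ between APC versions.

        <?php
        // Rough fragmentation check modelled on what apc.php reports:
        // free blocks smaller than 5 MB count as fragmented space.
        $sma = apc_sma_info(false);

        $freeTotal = 0;  // total free memory across all shared-memory segments
        $fragSize  = 0;  // free memory sitting in blocks smaller than 5 MB

        foreach ($sma['block_lists'] as $blocks) {
            foreach ($blocks as $block) {
                $freeTotal += $block['size'];
                if ($block['size'] < 5 * 1024 * 1024) {
                    $fragSize += $block['size'];
                }
            }
        }

        $fragmentation = $freeTotal > 0 ? ($fragSize / $freeTotal) * 100 : 0;
        printf("Free: %.1f MB, fragmentation: %.1f%%\n",
               $freeTotal / 1048576, $fragmentation);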

    Read the article

  • Change the output of the facebook like button [migrated]

    - by Mechaflash
    I've tied the code for the Facebook Like button from the Facebook dev site (I currently use the iframe version) directly into the HTML of a node (or content post). I want to be able to manipulate what text is sent when someone hits the Like button. You can see the site and buttons here: www.masteringmoneybasics.org. I've tried both the iframe and HTML5 versions of the button and can't see where to alter what is sent. If there is no way to directly alter it, does anyone know what it looks for in the content, so I can structure the node correctly? If you notice, when you like the page, it doesn't get the first sentence of the content but all content after it, and I've tried putting different content in between the two lines (in its own <p>) and it still grabs the latter. Also, how does it figure out which image to grab from the page? Most times it doesn't take any image, but twice it has grabbed the middle school image.
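
    The Like button normally takes its title, description and image from Open Graph meta tags in the page head rather than from the visible markup, so declaring those tags explicitly is the usual way to control what gets shared. A minimal sketch follows, assuming the tags can be emitted from the node's page template; the title, summary and image URL below are hypothetical placeholders.

        <?php
        // Emit explicit Open Graph tags in the <head> so the Like button
        // has a fixed title, description and image to use. The values are
        // placeholders; fill them from the node being rendered.
        $node_title = 'Mastering Money Basics';
        $summary    = 'The first sentence you actually want shared.';
        $image_url  = 'http://www.masteringmoneybasics.org/images/example.jpg';

        printf('<meta property="og:title" content="%s" />' . "\n",
               htmlspecialchars($node_title, ENT_QUOTES));
        printf('<meta property="og:description" content="%s" />' . "\n",
               htmlspecialchars($summary, ENT_QUOTES));
        printf('<meta property="og:image" content="%s" />' . "\n",
               htmlspecialchars($image_url, ENT_QUOTES));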

    Read the article

  • What framework for text rating site?

    - by problemofficer
    I want to start a "rate my"-style site. The rated objects are mostly texts. I want it to be rather simple. Features I need:

    - object rating (thumbs up, thumbs down)
    - object comments
    - object tags
    - related-object presentation based on tags
    - user authentication and management
    - private message system
    - sanity checks for text inputs (i.e. prevention of code injection)
    - cache
    - open source
    - runs on GNU/Linux

    I would gladly take something tailored to my scenario, but a generic framework would be fine too. I simply don't want to write things like user authentication that have been written a million times, and risk security flaws. Programming language is irrelevant, but Python/PHP preferred.

    Read the article

  • Will using HTTPS hurt my site's SEO or other statistics?

    - by yannbane
    I've set up a WordPress blog. Since I have to log into it from many different locations/machines, I've also got an SSL certificate and set up Apache to redirect HTTP to HTTPS. It all works, but I'm wondering whether that's overkill. Since most people who visit my site don't have to log in, I'm starting to wonder whether HTTPS has some drawbacks. If so, should I look for a way to make HTTPS optional?
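
    If the only reason for HTTPS is the dashboard, WordPress can force SSL on the admin screens while leaving public pages on plain HTTP. A minimal wp-config.php sketch follows; FORCE_SSL_ADMIN is a standard WordPress constant, and the blanket Apache redirect would need to be removed for this to take effect.

        <?php
        // In wp-config.php, above the "That's all, stop editing!" line:
        // force SSL for the WordPress admin area so dashboard sessions stay
        // encrypted, leaving normal visitor traffic on plain HTTP.
        define('FORCE_SSL_ADMIN', true);

        // If the site ever sits behind a proxy that terminates SSL, tell
        // WordPress when the original request was actually HTTPS.
        if (isset($_SERVER['HTTP_X_FORWARDED_PROTO'])
            && $_SERVER['HTTP_X_FORWARDED_PROTO'] === 'https') {
            $_SERVER['HTTPS'] = 'on';
        }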

    Read the article

  • Can you disavow a whole domain apart from the index page?

    - by Silver89
    Many years ago I may have bought a few sitewide links for some of my sites. These have now come back to haunt me and I need to sort them out. I've tried to contact the owners, but they're too lazy to bother changing the sites, so I figure it's time to disavow the links. Is there a way to disavow all of the sitewide links on a domain apart from those on the index page, and would leaving the index page out be a benefit, or would it still be seen as spammy? Something like:

        # Contacted owner of shadyseo.com on 7/1/2012 to
        # ask for link removal but got no response
        domain:shadyseo.com
        !shadyseo.com/index.php

    Read the article

  • Crossbrowser issue - navigation-menu [closed]

    - by aztekk
    I'm having issues with cross-browser compatibility on the navigation menu for my site. The issue is that it's not working as expected in MSIE; it bugs out on mouseover. The site runs on WordPress and the theme is called GreenChilli. It's a free theme from MyThemeShop, and they don't seem to be very active in resolving free-theme issues on their forum. Can someone have a look and see if this is an easy fix, or if I maybe have to abandon this theme for something else? The site is: http://lamslagen.com

    Read the article

  • How do I recover a site after WordPress' Automatic Update has failed?

    - by Metacom
    I manage several WordPress websites and recently used the Automatic Upgrade feature to bring them up to 3.1. However, on one of the sites I received a 503 (I believe it was a 503). After that I was presented with "The service is unavailable." whenever I tried to access any page, including the index page, wp-admin, etc. I have had similar problems before when a WordPress site got stuck in maintenance mode, and all I needed to do was log in via FTP and rename or delete the .maintenance file. I tried that in this case, but it didn't do the trick. I am now presented with "Fatal error: Call to undefined function require_wp_db() in **\wp-settings.php on line 71". I can't figure out how to fix this problem, and I was wondering if anyone else had any ideas. Any suggestions are appreciated! Thank you for reading.

    Read the article

  • Is it possible to tell a search engine not to index a specific section of an HTML page? [closed]

    - by Justin
    Possible Duplicate: Preventing robots from crawling specific part of a page. I know you can use robots.txt to ignore entire pages or sections of your site, but is there a way to tell crawlers like the Googlebot to ignore specific sections of an HTML page? I found this blog post that discusses one method, but it appears to work only for the Google Search Appliance, not the Googlebot. Is there some method, at least for Google, to do this?

    Read the article

  • After replacing all tables in an old website with divs, what other steps should I take?

    - by guisasso
    I designed a website a few years back; it ranks pretty well, the customer is happy, no problems there. I took one of the pages and manually replaced all the tables with divs, used structured data, and got the page to look exactly the same. I would like to know what other steps I should take to improve, or at least not hurt, this page's rank, or whether I should perhaps just not bother altogether. What are best practices here? The page is not live yet. Thanks.

    Read the article

  • Is it legal to charge extra fees for copyrighted content on mobile platforms?

    - by Macrow Willson
    This question just came up as we recently bought content from image stock portals. Many of those have altered their license agreements in favor of charging more for use in mobile apps. So instead of using their standard licenses, you need to pay for an "extended" license, which easily multiplies the fee by 5-10 times. That doesn't make sense, as a mobile device is just a smaller browser and protects the content even better than a desktop computer. Are those stock agencies allowed to do that, and is it legal at all? I am not a lawyer, but I would even risk going on with the standard license and waiting to be sued over the matter.

    Read the article

  • Search Result Organization

    - by Vecta
    I'm creating an AJAX live search on a website I'm working on. Users will select values from a few dropdowns and a list of products will be returned based on what they select. Some possible fields would be: color, model, make, etc. What type of organization of search results do users tend to find most useful? Is it better to lump them all together (alphabetized), or is it more useful to group them by make? In the past I've tended to group them by make, but I'm now concerned that this will continually force items whose make falls toward the end of the alphabet to the bottom of the list. Any tips are greatly appreciated.
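
    For reference, a sketch of the "group by make, then alphabetize within each group" ordering being weighed up, using a hypothetical result set: makes late in the alphabet still sink, but only relative to other makes, while models stay alphabetized inside their own group.

        <?php
        // Hypothetical result set; in practice this would come from the
        // AJAX search query.
        $products = array(
            array('make' => 'Zenith', 'model' => 'Z-100', 'color' => 'blue'),
            array('make' => 'Acme',   'model' => 'B200',  'color' => 'red'),
            array('make' => 'Acme',   'model' => 'A100',  'color' => 'black'),
        );

        // Order by make first, then by model within each make.
        usort($products, function ($a, $b) {
            $byMake = strcmp($a['make'], $b['make']);
            return $byMake !== 0 ? $byMake : strcmp($a['model'], $b['model']);
        });

        foreach ($products as $p) {
            echo "{$p['make']} {$p['model']} ({$p['color']})\n";
        }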

    Read the article

  • Website migration from WordPress to a static site and doing 301 redirects without access to existing site?

    - by user3114468
    I'm currently working on a project that is hosted on WordPress and is being migrated to a static site. However, I presently do not have access to the existing site, as it's managed by another developer. The concern is not lack of access to content, as the site owner has generated very little content (the reason for the migration) and we were able to copy it manually. Rather, the concern is doing 301 redirects. The site will not change domains, but URLs will, such as from example.com/?page_id=3 to example.com/services. To add, the site is migrating to a new server using the same domain name. I thought maybe this could be done by editing permalinks prior to migration, and WordPress would update automatically if configured to write to the server. But if it is not configured that way (as is not always the case), I do not have .htaccess access to fix it in case there are suddenly a bunch of 404 errors for every page. I could really use some help on the best procedure to follow in this case. This is the first migration project I've worked on.
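
    One option once the new server is under your control: even a static site can keep a single small PHP handler that catches the old ?page_id URLs and 301s them to the new paths, so nothing on the old install needs touching. A minimal sketch follows; the ID-to-path map is hypothetical and would be filled in from the old permalinks.

        <?php
        // Hypothetical map of old WordPress page IDs to new static paths.
        $map = array(
            3  => '/services',
            7  => '/about',
            12 => '/contact',
        );

        $pageId = isset($_GET['page_id']) ? (int) $_GET['page_id'] : null;

        if ($pageId !== null && isset($map[$pageId])) {
            // Permanent redirect so search engines transfer the old URL's value.
            header('Location: http://example.com' . $map[$pageId], true, 301);
            exit;
        }

        // Anything unmapped gets a plain 404 rather than a misleading redirect.
        header('HTTP/1.0 404 Not Found');
        echo 'Not found';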

    Read the article

  • Is mixing 'Adsense' banners and content okay on a Pinterest style layout?

    - by Theodores
    I was under the impression that Google likes to have their adverts clearly separated from content, so that people don't accidentally click on the adverts thinking they are articles. For a 'Pinterest'-style layout, where you only see the one page and a few pop-ups over that one page, you could mix the adverts in with the content, as demonstrated by the two adverts slap in the middle on this site. Clearly this can be done and it exists in the wild, with Google adverts being supplied to the site. However, is that against the spirit and/or the letter of what one signs up to with AdSense?

    Read the article

  • jQuery setTimeout delay for an element

    - by Trouble
    Is there an easier way to wait for an element to load (created by an independent script / MooTools / other)? For example: I am waiting for a Google map to load, but I don't want to use its API for checks. So I made two functions:

        function checkIfexist() {
            if (jQuery('#container').length) {
                return 0;
            } else {
                reload(1);
            }
        }

        function reload(mode) {
            setTimeout(function () {
                // do stuff ...
                if (mode == 1) {
                    checkIfexist();
                }
            }, 400);
        }

    I am starting it with reload(1);. Is there an easier way to use setTimeout like this? I don't want to use delay, wait or whatever.

    Read the article

  • Do I really need to remove special characters in a URL?

    - by anarchoi
    I have an FTP account shared with friends where we upload underground music albums, and then we use the links to share the downloads on a music forum. The problem is that the album names are in French, so there are a lot of special characters in the names. So a URL looks like http://www.mydomain.com/downloads/Some Band - En français avec des caractères spéciaux (2013) [7'' EP].zip For me it works perfectly and I can download the file using this URL, but I have read everywhere that special characters are bad in URLs. Is there any reason why I must remove the special characters or encode the URL? Is everyone able to access a URL with special characters, or will some older browsers not be able to download the files? I really don't care about SEO or anything else; I just want the download links to work for everyone. Since the files are uploaded through FTP, I can't use PHP to remove the special characters with a regex, so I really don't know what to do.
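
    The files can stay exactly as they are on the server; only the links pasted into the forum need percent-encoding, which every browser decodes back to the same file name. A minimal sketch, assuming the link is built with PHP somewhere convenient; each path segment is encoded with rawurlencode() so the slashes survive.

        <?php
        // Percent-encode every path segment of the download link while
        // leaving the directory separators intact.
        $base = 'http://www.mydomain.com';
        $path = "/downloads/Some Band - En français avec des caractères spéciaux (2013) [7'' EP].zip";

        $encoded = implode('/', array_map('rawurlencode', explode('/', $path)));
        echo $base . $encoded . "\n";
        // Prints the same URL with spaces and accented characters
        // percent-encoded (e.g. %20, %C3%A7), which works in any browser.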

    Read the article

  • How Does Domain Know Where Your Web Host Is Located [closed]

    - by icu222much
    Possible Duplicate: How Does Domain Know Where Your Web Host Is Located? I just purchased a domain name from RapidNames and a hosting plan at JustHost. I was told to enter JustHost's name server (ns1.justhost.com) in my domain name's name server field and wait 24 hours for the process to complete. I do not understand how a domain registered at RapidNames can find my account on JustHost's server, as I am sure I am not JustHost's only customer. I have read the article How DNS Works that John Conde posted, but I still do not understand the issue. After reading several other articles, I am beginning to understand how it works, but I would still like someone to confirm whether I am correct or not. From my understanding, linking your domain name to your web host is a two-step process. First, you need to tell your domain name who your web host is. This is done by providing the two DNS server addresses. Secondly, you need to tell your web host which domain names you own by entering your domain names into the domain name manager. As a result, when someone queries your domain name, they will be forwarded to your web host. The web host will look in their database to match the domain name to the account's owner and then serve the appropriate website. Is my understanding of how a domain knows where your web host is located accurate?
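
    That last step is name-based virtual hosting: DNS gets the browser to the host's IP address, and the browser then sends a Host header that the web server maps to the right customer's site. A toy sketch of the idea follows; real shared hosts do this inside the web server configuration rather than in PHP, and the domains and paths are hypothetical.

        <?php
        // Toy illustration of what the host's web server does with the
        // Host: header sent by the browser.
        $vhosts = array(
            'www.customer-one.example' => '/home/customer1/public_html',
            'www.customer-two.example' => '/home/customer2/public_html',
        );

        $host = isset($_SERVER['HTTP_HOST']) ? strtolower($_SERVER['HTTP_HOST']) : '';

        if (isset($vhosts[$host])) {
            echo "Serving {$host} from {$vhosts[$host]}\n";
        } else {
            echo "Unknown host: fall back to the server's default site\n";
        }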

    Read the article

  • 301 redirect from a country specific domain

    - by Raj
    I originally started using a .do domain extension for my site, but later realized that this country-specific domain would prevent us from appearing in search results for places outside the Dominican Republic. We started using a .co domain extension and redirected all requests to the new domain using an HTTP 301. The "Crawl Stats" in Google Webmaster Tools show me that the .co domain is being crawled, but the "Index Status" shows the number of pages indexed as 0. The "Crawl Stats" for the .do domain say that it's being crawled, and its "Index Status" shows a number greater than 0. I also set a "Change of Address" in Google Webmaster Tools to point the .do domain at the new .co domain. We're still not appearing in search results at all, even for very specific strings where I would expect to find us. Am I doing something wrong?

    Read the article

  • Google Analytics Funnel problems

    - by Alex
    I have a problem with the funnels in Google Analytics. I have an e-commerce website and I want to track the user's path to a purchase. I want GA to track whether a user goes through these steps: [Item page] - [Purchase] - [Checkout]. I thought this could be done with funnels, and my setup currently consists of:

        Step 1: [Item page] (Required)
        Step 2: [Purchase]
        Goal:   [Checkout]

    But when I go to the Funnel Visualization Report, the following shows:

        [Item page]      Visits: 150
        [Purchase page]  Visits: 170
        [Checkout]       Visits: 32

    How can the [Purchase page] be higher than the [Item page]? I searched the internet over and found something called Horizontal Funnels, but this doesn't show the correct numbers either; again, the purchase and checkout steps are higher than the item page. So somehow it doesn't need step 1 to fulfill the funnels/goals. What am I doing wrong?

    Read the article

  • How to handle CNAME host redirect to virtual directory?

    - by esac
    I have an internal website and a virtual directory, http://server2012/logs. I created a CNAME on my DNS server pointing LOGS at server2012. I would like to set it up so that http://LOGS redirects to http://server2012/logs. Ideally, I would still want all pages to appear in the browser as being served under the LOGS URL. So http://LOGS/network.html?site=32 is what is displayed in the browser, but it is really being served from http://server2012/logs/network.html?site=32. I've looked at URL Rewrite, but can't seem to get it to work.

    Read the article

  • Standalone URL 301 Redirect Manager [on hold]

    - by Lex
    I'm looking for a script with a simple interface that helps me manage a large list of 301 redirects of the form http://example.com/short ---> http://example.com/long-and-descriptive. For example, the following WordPress plugin does the job, but it seems excessive to install WordPress just for this one simple piece of functionality. It looks like my question is similar to this closed question, but hopefully rephrased in a way that makes it more relevant and "constructive".
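
    In case it helps frame the search, the whole job can be done by a script small enough to write by hand: a lookup table of short paths to long URLs plus a 301 response. A minimal sketch follows, assuming a hypothetical redirects.csv with "short-path,long-url" lines and a web server configured to route unmatched requests to this script.

        <?php
        // Load the redirect map from a hypothetical redirects.csv file with
        // lines like: /short,http://example.com/long-and-descriptive
        $redirects = array();
        foreach (file('redirects.csv', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES) as $line) {
            if (strpos($line, ',') === false) {
                continue; // skip malformed lines
            }
            list($short, $long) = array_map('trim', explode(',', $line, 2));
            $redirects[$short] = $long;
        }

        // Match the requested path (query string ignored) against the map.
        $path = parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH);

        if (isset($redirects[$path])) {
            header('Location: ' . $redirects[$path], true, 301);
            exit;
        }

        header('HTTP/1.0 404 Not Found');
        echo 'No redirect configured for this URL.';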

    Read the article

  • With Google DFP (Small Business) is it possible to disable AdSense in an Ad Slot on a per-request basis?

    - by Daniel Pehrson
    Setup: I run a network of websites that target different hobby niches and have a section dedicated to community classifieds. I serve advertising on these sites through Google DFP for Small Business, with AdSense enabled on the slots. Problem: One of the next sites in my network will target the firearms/shooting industry, and as such the classifieds section will not comply with the AdSense prohibited-content guidelines regarding the sale (or coordination of the sale) of weapons. I work very hard to comply with my partners' guidelines even when I don't understand or agree with them, and after talking with many people I have decided that the best option is to disable AdSense serving on that section of that website while leaving it on for the rest of the network. Solution: Right now my only idea is to duplicate all my ad slots and tack "_sensitive" onto the end of each one (e.g. header and header_sensitive), conditionally registering ad slots based on whether or not I am in the sensitive section of the sensitive site. My hope, however, is that there is a way to accomplish this without duplicating all my ad slots, possibly with some sort of option to the GA_googleFillSlot() call that lets me say "load ads from this slot but do not serve AdSense no matter what."

    Read the article

  • External link tracking when opening the link in a new window in Google Analytics?

    - by evanmcd
    OK, so this seems like a really simple problem, but I have yet to find a solution that accomplishes the following:

    - opens the link in a new window
    - tracks the event in GA (obviously)
    - doesn't trigger pop-up blockers (uses target="_blank" instead of window.open)

    Most of the code I've seen, including Google's, doesn't take into account the case of opening in a new window; it just sets window.location.href. Even GAAddons (http://gaaddons.com/), which charges for commercial use, doesn't seem to work for me. Perhaps I'm missing something simple; I'd be relieved if so and would profusely thank whoever points it out to me! If no one is able to provide an example, I'll post some of the test cases I've created to illustrate the problem. Thanks.

    Read the article

  • Impact of migrating home page from http to https on search results

    - by 2Stroke SEO
    Guys, I've just had to change one of my sites' home pages from http to https. I had plenty of links coming into the http page and was performing well in Google for many of my targeted search phrases. I did a 301 redirect from the http page to transfer the link juice to the https page (and to prevent duplicate-content issues), but my search rankings have tanked, which suggests no link juice has been transferred. My PR has vanished, which I'd expect, but I'm really surprised that the SERP rankings fell off the face of the earth. Does anyone have any ideas how I can recover from this? I've waited a couple of months since the changes took effect, just in case Google was taking time to check it out.
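
    One thing worth double-checking is that the redirect is a single, path-preserving 301 rather than a redirect of every old URL to the https home page, which can lose the value of deep links. A sketch of that check follows; this logic normally lives in the Apache configuration, and it is written in PHP here only to make the intent explicit.

        <?php
        // Redirect any plain-HTTP request to the same path on HTTPS with a
        // single permanent (301) hop.
        $isHttps = (!empty($_SERVER['HTTPS']) && $_SERVER['HTTPS'] !== 'off')
            || (isset($_SERVER['SERVER_PORT']) && (int) $_SERVER['SERVER_PORT'] === 443);

        if (!$isHttps) {
            $target = 'https://' . $_SERVER['HTTP_HOST'] . $_SERVER['REQUEST_URI'];
            header('Location: ' . $target, true, 301);
            exit;
        }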

    Read the article

  • Apache 2.4 PHP 5.4 MySQL enabled but not found? [migrated]

    - by jurchiks
    Just tried setting up the latest Apache (2.4.1 x86 VC10 from ApacheLounge) and PHP (5.4 VC9 TS x86, with php5apache2_4.dll) on my PC and ran into this weird problem. In my php.ini I have enabled all of the following:

        extension=php_mysql.dll
        extension=php_mysqli.dll
        extension=php_pdo_mysql.dll

    but when I try to print the available PDO drivers using print_r(PDO::getAvailableDrivers()); it gives me an empty array. phpMyAdmin and MantisBT also refuse to work, saying that the mysql extension is missing. phpinfo() gives this:

        PDO
        PDO support    enabled
        PDO drivers    no value

    and when searching for mysql, only the mysqlnd section pops up... The DLLs are there in the D:/php/ext folder, no errors pop up when starting Apache, there are no errors in the Windows Event Log, and PHP itself has no error log anyway. What could possibly be the problem? Before this I had Apache 2.2.22 from ApacheLounge and PHP 5.3.9 and I had no problems. Now nothing works...
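
    A quick diagnostic that often narrows this down: ask the Apache-run PHP itself which php.ini it loaded and where it is looking for extensions, since a mismatch there (or an extension_dir not pointing at D:/php/ext) produces exactly this silent "nothing loads" symptom. A minimal sketch using only standard PHP functions:

        <?php
        // Confirm which configuration the Apache SAPI is really using.
        var_dump(php_ini_loaded_file());      // the php.ini actually in use
        var_dump(ini_get('extension_dir'));   // where PHP looks for the DLLs

        // Check whether the MySQL extensions made it into this SAPI.
        foreach (array('mysql', 'mysqli', 'pdo_mysql') as $ext) {
            printf("%-10s %s\n", $ext, extension_loaded($ext) ? 'loaded' : 'MISSING');
        }

        // Should list "mysql" once pdo_mysql loads correctly.
        print_r(PDO::getAvailableDrivers());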

    Read the article

  • Canonical redirection meta tag [duplicate]

    - by sankalp
    This question already has an answer here: How to use rel='canonical' properly. There are two pages on my website with the same content; only the URLs are different: www.websitename.com and www.websitename.com/default.html. Someone suggested that I add canonical tags to avoid them being treated as duplicate content. Where should I add the canonical tags, and why?
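
    For reference, the tag itself is a single line in the <head> of each duplicate, pointing at whichever address should be treated as the preferred one. A minimal sketch follows; choosing the bare domain as the preferred URL is an assumption, and the tag could just as well be written into the HTML directly instead of echoed from PHP.

        <?php
        // Emit the canonical link in the <head> of both www.websitename.com
        // and www.websitename.com/default.html, pointing at the preferred URL.
        $canonical = 'http://www.websitename.com/';  // assumed preferred form
        echo '<link rel="canonical" href="'
            . htmlspecialchars($canonical, ENT_QUOTES) . '" />' . "\n";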

    Read the article
