Search Results

Search found 3750 results on 150 pages for 'joomla sef urls'.

Page 79/150

  • CMS for coding blog

    - by OrgnlDave
    I've got a server with a LAMP stack and such. I'd like to host a blog-type site (or, if there's a free place good for this, that would be cool!) that covers a variety of tutorials, interesting content, etc. There are tons of CMSes out there, but if you search for tips on ones that handle programming content well, you get tons of hits about web development instead. I'd like to know if anyone here has recommendations from actually using a CMS for this type of thing, or, short of that, can recommend one not based on generalities like "Joomla! is great!" I'm looking for the least setup time possible. I'm proficient with CSS and I can design a color scheme, so that's not a big problem. As you can expect, attaching files and pictures and syntax highlighting (C/C++-ish is good) are musts. The ability to group posts, perhaps using tags, would be cool too, but isn't necessary. As I write this, it almost sounds like it would be easier to custom-code a small PHP site myself.

  • How to crawl a web page with dynamic content added by JavaScript

    - by blunderboy
    I hear that Google's bots can now execute JavaScript, which means it should be possible to fully crawl a web page that uses lazy loading. I am using Apache Nutch to crawl websites, but I don't think it can fetch the URLs that JavaScript injects into the HTML as the page is scrolled down. I see a lot of websites using lazy loading for performance reasons. So can somebody please explain how I can crawl the content that only appears in the HTML on lazy load (i.e., when the page is scrolled down)?
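
    Nutch on its own won't execute the JavaScript, so a common workaround is to render the page in a headless browser first and feed the URLs it discovers back to the crawler as seeds. A minimal sketch in Python with Selenium, assuming a headless Chrome/chromedriver setup; the URL and scroll count are placeholders:

        # Render a lazy-loading page in headless Chrome, then collect the links
        # that JavaScript injected while scrolling. Assumes Selenium 4+ and a
        # matching chromedriver are installed; the target URL is a placeholder.
        import time

        from selenium import webdriver
        from selenium.webdriver.chrome.options import Options
        from selenium.webdriver.common.by import By

        options = Options()
        options.add_argument("--headless")
        driver = webdriver.Chrome(options=options)

        driver.get("http://example.com/lazy-page")   # placeholder URL
        for _ in range(10):                          # fixed scroll count; tune per site
            driver.execute_script("window.scrollTo(0, document.body.scrollHeight);")
            time.sleep(2)                            # crude wait for the AJAX content

        urls = {a.get_attribute("href") for a in driver.find_elements(By.TAG_NAME, "a")}
        driver.quit()

        for url in sorted(u for u in urls if u):
            print(url)                               # e.g. feed these to Nutch as a seed list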

  • My blog which gets 300+ daily impressions has stopped appearing on the 1st page of Google

    - by Sangram
    I have run a blog about placement papers since December 2010. My monthly impressions are around 4,000, but for the last two days my blog has disappeared from Google's search result pages and impressions have dropped drastically (stats report attached). My blog is still in the index: when I search site:mydomain.com on Google, I can see all my pages listed there. But the pages that used to appear on the first or second page of Google results no longer appear at all. Example: if I search for the query GE round 2 code writing test on Bing or Yahoo, the first link on the results page is my blog, but on Google my URLs do not appear even in the first three result pages. I used to get a lot of visitors from these search queries.

  • Apache mpm-itk Performance

    - by Matt Beckman
    I manage a bunch of VPSs with memory ranging from 1GB to 8GB. Most of the sites they host are Joomla sites, and the servers must support multiple sites/users/SFTP. I use mpm-itk almost exclusively, mostly due to its convenience in these shared environments. However, I'm aware it isn't known for performance, so I need some advice on making it faster. Due to the lack of documentation when I first went the way of mpm-itk, I included only one setting in the config, limiting each user to 50 clients (the rest I left at the defaults):

        <IfModule mpm_itk_module>
            MaxClientsVHost 50
        </IfModule>

    Are there any better alternatives available? Are there any settings supported in mpm-prefork or mpm-worker that are also supported in mpm-itk? Thanks!

  • WAMP - Changing PHP version stops server running

    - by James Connor
    I downloaded WAMP, and it works (green icon). However, I need to test a site locally in Joomla 1.5, which throws errors under PHP 5.3; I believe I need a PHP version lower than this, i.e. 5.2.x. To do this I went through PHP > Version > Get More... and installed an older PHP version. However, when I start this PHP version the icon stays orange and localhost doesn't load. I haven't used WAMP before, so my knowledge of it is limited. If anyone could point me in the right direction it would be greatly appreciated. Regards.

  • Ubuntu 12.04 LTS Desktop 64-bit: user permissions or apache2 rewrite problem

    - by mtm
    I have installed 12.04 Desktop 64-bit, manually installed LAMP, phpMyAdmin, php5-dev, PEAR, PECL, APC, and ssh, created a user to own /var/www/, and transferred three sites into it. The sites live in subfolders, and all of them are configured in sites-available and enabled. One site is pure HTML; the other two are PHP. I enabled curl. At first phpMyAdmin and the PHP sites worked, then they stopped working (they show blank pages), and the sites report that clean URLs cannot be enabled. The HTML site still works. Where is the problem, and why did the PHP sites stop working? In all the Apache .conf files AllowOverride is set to All, and the PHP sites have .htaccess files. This same configuration worked under Ubuntu 10.04.

  • Need to redirect WordPress category archives

    - by Scott
    I recently changed my WordPress category structure a bit, changing some of the names and placing some categories under different parents. I don't use the category name in my post URLs, so that's not a problem. But my category archive pages are indexed and have page rank I don't want to lose. So I need to redirect /category/old_cat_name to /category/new_cat_name, or in some cases to /new_cat_name/new_sub_cat. I gather that I can't do this through the WP Redirection plugin and that I have to modify my .htaccess. Can someone show me what lines to add there, or is there another, better way to do this? Thanks.
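
    For a handful of renamed categories, plain mod_alias redirects in .htaccess (one "Redirect 301 old new" line per rename) are usually enough. A throwaway generator sketch in Python, with hypothetical category slugs, so nothing gets mistyped:

        # Emit 301 redirect lines for .htaccess from an old->new category map.
        # The slugs below are made-up examples; substitute the real ones.
        renames = {
            "/category/old_cat_name": "/category/new_cat_name",
            "/category/other_old_name": "/new_cat_name/new_sub_cat",
        }

        for old, new in renames.items():
            # mod_alias syntax: Redirect <status> <old-path> <new-path>
            print(f"Redirect 301 {old} {new}")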

  • How can I back up my PPAs?

    - by Scaine
    Related to this question. My concern is that over the past year most of my more interesting (or most used) applications have come from PPAs, and just backing up my sources list won't add the associated Launchpad keys the way add-apt-repository does. So I'm looking for a way to list all the PPA URLs (like ppa:chromium-daily/stable) so that I can easily script a series of add-apt-repository commands to add them to a new installation gracefully. Short of dumping my bash history, of course, which might be feasible depending on how far back that file goes.
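
    add-apt-repository writes one .list file per PPA under /etc/apt/sources.list.d, and the PPA name can be recovered from the Launchpad URL inside each file. A sketch that rebuilds the add-apt-repository commands, assuming the stock one-line "deb http://ppa.launchpad.net/<owner>/<name>/ubuntu ..." format; re-running the printed commands on a fresh install re-imports the signing keys too:

        # Rebuild "add-apt-repository ppa:owner/name" commands from the .list
        # files that add-apt-repository itself created.
        import glob
        import re

        PPA_RE = re.compile(r"ppa\.launchpad\.net/([^/\s]+)/([^/\s]+)")

        ppas = set()
        for path in glob.glob("/etc/apt/sources.list.d/*.list"):
            with open(path) as fh:
                for line in fh:
                    if line.lstrip().startswith("deb"):
                        match = PPA_RE.search(line)
                        if match:
                            ppas.add(f"ppa:{match.group(1)}/{match.group(2)}")

        for ppa in sorted(ppas):
            print(f"sudo add-apt-repository {ppa}")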

  • What is the best way to have the same website in multiple domains?

    - by Daniel Magliola
    I would like to have the same website, selling a specific product, on multiple domains, to take advantage of keywords matching the domain name for several different searches. However, I understand that having the same content on multiple sites will unleash the wrath of Google. If I redirect all domains but one to that remaining one, do I still get any bonus for the "magic exact domain match jackpot"? The same question applies to canonical URLs. What's the best way to approach this? Thanks!

  • How to show the right country domain in Google Places?

    - by Baumr
    Background: A site has multiple ccTLDs: example.com for the US, example.co.uk for UK users, example.de for Germans, etc. Googling for certain city keywords returns rich snippets with a list of Google Places (see the screenshot that accompanied this question).

    Problem: When searching on Google Germany, the domain for US users (example.com) appears instead of the corresponding ccTLD (example.de). This is not a good user experience, as users would most likely prefer to book on a site localized for them (e.g. language and currency).

    Question: What solutions are there? Is it possible to return different ccTLDs in rich snippets for Google searches in Germany and the UK?

    Ideas: Would implementing the hreflang annotation resolve this? What about entering multiple corresponding URLs in the structured data markup?

  • Disadvantages of a fake phpMyAdmin honeypot that causes IP blacklisting, with a robots.txt disallow/exclusion of the honeypot?

    - by Tchalvak
    I'm trying to figure out whether I should set up a honeypot system with a fake phpMyAdmin (the site gets hits all the time from people spidering for vulnerable installs of that app). My thought was to create a honeypot PHP script that would mimic a phpMyAdmin login, and then blacklist IPs that hit that URL (and aren't already whitelisted). I would then add the appropriate URLs to robots.txt so that spiders that actually respect my robots.txt wouldn't be caught by the blacklist. Are there disadvantages to this approach? Do legit robots sometimes not respect robots.txt in certain circumstances? Are there any other problems with this that I should consider in advance?
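
    For what it's worth, the blacklisting half can be very small. A sketch in Python, assuming the fake login page appends one client IP per line to a hit log; the paths and whitelist are hypothetical, and a real version would also skip IPs that are already blocked:

        # Turn honeypot hits into iptables drop rules. Run as root (e.g. from
        # cron). HIT_LOG is assumed to be written by the fake phpMyAdmin page.
        import subprocess

        HIT_LOG = "/var/log/honeypot_hits.log"      # hypothetical path
        WHITELIST = {"127.0.0.1", "203.0.113.10"}   # your own known-good IPs

        with open(HIT_LOG) as fh:
            offenders = {line.strip() for line in fh if line.strip()} - WHITELIST

        for ip in sorted(offenders):
            # Standard iptables drop rule; de-duplicate against existing rules
            # before using this for real.
            subprocess.run(["iptables", "-A", "INPUT", "-s", ip, "-j", "DROP"], check=True)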

  • How long does it take for Google to index a site after submitting a sitemap in Webmaster Tools? [closed]

    - by Venkatesh Hodavdekar
    Possible Duplicate: Why isn't my website in Google search results? I submitted my website to Google today through Google Webmaster Tools, using a sitemap. The sitemap status says OK and shows that 12 URLs have been recognized. I was wondering how long it takes for the links to get indexed, as the indexed-URLs option says "No data available. Please check back soon." I am not sure whether it is showing this message due to some error, or whether everything is fine.

  • What are the consequences of using relative Location headers?

    - by Alan Storm
    According to the spec, the Location header used in a redirect requires an absolute URL with a server name:

        HTTP/1.1 301 Moved Permanently
        ...
        Location: http://example.com/foo/baz/bar

    However, in 2012, most web browsers will recognize a relative path and redirect you to the new location using the original server name:

        HTTP/1.1 301 Moved Permanently
        ...
        Location: /foo/baz/bar

    Are there any negative or surprising consequences of using relative URLs in Location headers? My particular concern is how Google and other search engines will interpret this, but if there's anything else I'm not thinking of, I'd love to hear it.
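
    (The later HTTP/1.1 revision, RFC 7231, did eventually legitimize relative Location values.) Clients resolve a relative Location against the URL they just requested, the same RFC 3986 resolution that Python's urljoin performs:

        # How a client resolves "Location: /foo/baz/bar" received while
        # requesting http://example.com/some/old/page.
        from urllib.parse import urljoin

        print(urljoin("http://example.com/some/old/page", "/foo/baz/bar"))
        # -> http://example.com/foo/baz/bar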

  • Problem downloading .exe file from Amazon S3 with a signed URL in IE

    - by Joe Corkery
    I have a large collection of Windows .exe files which are being stored and distributed using Amazon S3. We use signed URLs to control access to the files, and this works great except in one case: when trying to download an .exe file using Internet Explorer (version 8). It works just fine in Firefox. It also works fine if you don't use a signed URL (but that is not an option). What happens is that the IE downloader changes the name from 'myfile.exe' to 'myfile[1]', and Windows no longer recognizes it as an executable. Any advice would be greatly appreciated. Thanks.
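
    One common fix is to bake a Content-Disposition header into the signed URL so the browser keeps the real filename. A sketch with the modern boto3 SDK (the original setup likely predates it); bucket, key, and expiry are placeholders:

        # Presign an S3 GET that tells the browser the real filename, so IE
        # saves "myfile.exe" instead of "myfile[1]". Assumes boto3 credentials
        # are configured in the environment.
        import boto3

        s3 = boto3.client("s3")
        url = s3.generate_presigned_url(
            ClientMethod="get_object",
            Params={
                "Bucket": "my-bucket",              # placeholder
                "Key": "installers/myfile.exe",     # placeholder
                "ResponseContentDisposition": 'attachment; filename="myfile.exe"',
            },
            ExpiresIn=3600,                         # signature valid for one hour
        )
        print(url)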

  • Job Search Engine URL Structure Issue [closed]

    - by Justin
    Possible Duplicate: What is the best structure of an SEO-friendly URL? I am working on a job board, and I'm trying to figure out a good design for the URL structure. Some things that I have found through research:

      • 100-150 characters long is ideal
      • 3-5 words in your URL, according to Matt Cutts
      • Use .htaccess to force clean URLs
      • Do not duplicate data (important)
      • Clean and precise, describing the content
      • Use hyphens

    On the homepage I try to detect the user's location based on IP, but this isn't always accurate or reliable, so until they enter their city/location I can't always use that structure, though it is potentially workable. For searching, a form posts to a results page: domain.com/jobs/[city]/[search], e.g. domain.com/jobs/toronto/sales-manager/, or domain.com/search/jobs/toronto/sales-manager/; or do I remove the word "jobs" and just use "search"? I'm trying to keep good search terms in the URL while keeping it clean and concise. Can someone give me some feedback and thoughts as to why?
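
    Whichever structure wins, the city and search terms need normalizing into URL-safe slugs (lowercase, hyphens, no punctuation) before they go into the path. A minimal sketch:

        # Normalize user input like "Sales Manager" into a hyphenated URL slug.
        import re

        def slugify(text: str) -> str:
            text = text.strip().lower()
            text = re.sub(r"[^a-z0-9]+", "-", text)   # collapse spaces/punctuation
            return text.strip("-")

        print(f"/jobs/{slugify('Toronto')}/{slugify('Sales Manager')}")
        # -> /jobs/toronto/sales-manager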

  • Installing Oracle 11g SOA Suite?

    - by asantaga
    Are you working for an SI like Accenture or Cap Gemini? Are you a sales consultant who needs to install software quickly? Well, if you're reading this, you probably are. Anyway, if you're like me and, like many techies, reading manuals doesn't come naturally, you'll download the software, try to install it, and then ultimately fail, or at least take a lot longer than you should. However, never fear, help is here! For Oracle 11g SOA Suite (PS3) a good friend of mine, a SOA 11g PM in the States, has written a quick-start document, and it's on OTN. Although the document is PS3-focused, apart from the download URLs it is also totally applicable to PS4. The document can be found at this link.

  • Securing unpatched websites

    - by neuron
    I have a client with a lot (read: several thousand) of websites running several old CMS solutions that are no longer maintained. Moving all of them to a maintained solution isn't really an option at this point, so I'm thinking about ways to secure the solutions without patching them. The solutions are mostly Joomla 1.0/1.5 and WordPress. What I'm thinking is something like this:

      • mod_suexec to lock everyone into their own home directory
      • AppArmor to deny any and all file writes by default (deny by default, allow things like "images" directories)
      • .htaccess to prevent anything in writable directories from being executed (i.e. disable the PHP engine for images/ directories)
      • MySQL triggers that check the users tables to prevent adding new admins/superadmins (see the sketch below)

    Does this make sense? Is it viable? Am I missing something obvious?
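
    On the last point, a cron-based audit is a lighter-weight stand-in for triggers: poll the users tables and alert when an unexpected admin-level row appears. A sketch with pymysql against a WordPress-style schema; table and column names vary per CMS, and the credentials and known-admin list are placeholders:

        # Flag unexpected admin accounts in a WordPress users table.
        import pymysql

        conn = pymysql.connect(host="localhost", user="audit",
                               password="secret", database="site_db")
        with conn.cursor() as cur:
            # In WordPress, a wp_capabilities meta value containing
            # "administrator" marks an admin account.
            cur.execute(
                "SELECT u.user_login FROM wp_users u "
                "JOIN wp_usermeta m ON m.user_id = u.ID "
                "WHERE m.meta_key = 'wp_capabilities' "
                "AND m.meta_value LIKE %s",
                ("%administrator%",),
            )
            admins = {row[0] for row in cur.fetchall()}
        conn.close()

        KNOWN = {"site_owner"}   # placeholder: the admins that should exist
        for login in sorted(admins - KNOWN):
            print(f"ALERT: unexpected admin account: {login}")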

  • Moving one site in Webmaster Tools to more than one site [closed]

    - by Towhid
    Possible Duplicate: How should I structure my URLs for both SEO and localization? I have a question-and-answer site about immigration, and I have now divided it into two sites: mysite.co.uk, about immigration to the UK, and mysite.com, with a subdomain for every other country, like australia.mysite.com, sweden.mysite.com, etc. I have moved all the content from the original site onto the .co.uk and .com sites and their subdomains to fill them. I know that Google will detect my two new sites as duplicates of the first one, which is very bad for SEO, and I don't think Google Webmaster Tools has a tool for this. So please guide me on how to fix this problem.

  • Any framework or library that allows me to run a large number of concurrent jobs on a schedule?

    - by Yoga
    Are there any high-level programming frameworks that allow me to run a large number of concurrent jobs on a schedule? E.g., I have 100K URLs and need to check their uptime every 5 minutes. I can certainly write a program to handle this, but then I need to handle concurrency, queuing, error handling, system throttling, job distribution, etc. Is there a framework that lets me focus only on the particular job (i.e. the ping task) while the system takes care of the scaling and error handling for me? I am open to any language.
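
    For the scheduling and distribution, frameworks like Celery (with its beat scheduler) exist, but the per-check core is small once something manages the concurrency. A sketch with Python's asyncio and aiohttp, leaving the every-5-minutes part to cron or a scheduler; the URL list and limits are placeholders:

        # Check many URLs concurrently with a throttle. The semaphore caps
        # in-flight requests so 100K URLs don't open 100K sockets at once.
        import asyncio

        import aiohttp

        CONCURRENCY = 500
        TIMEOUT = aiohttp.ClientTimeout(total=10)

        async def check(session, sem, url):
            async with sem:
                try:
                    async with session.get(url, timeout=TIMEOUT) as resp:
                        return url, resp.status
                except Exception as exc:   # DNS failure, timeout, refused...
                    return url, f"DOWN ({exc.__class__.__name__})"

        async def main(urls):
            sem = asyncio.Semaphore(CONCURRENCY)
            async with aiohttp.ClientSession() as session:
                results = await asyncio.gather(*(check(session, sem, u) for u in urls))
            for url, status in results:
                print(url, status)

        if __name__ == "__main__":
            asyncio.run(main(["http://example.com", "http://example.org"]))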

  • Every file on cPanel got deleted (then restored hours later), and I have no idea why

    - by mcranston18
    I apologize in advance if I don't provide the proper detail; I am new to server administration and am looking for general advice about this issue. I was helping out a client with web design last month. They have about a dozen static sites on one server, all built on Joomla except one, which I built on WordPress. Everything was working fine last month when we did the redesign, but all of a sudden this morning every single file on their server got deleted: every web page, every file, and all e-mail addresses. I phoned the hosting company (alliancewww.com) to ask why every single file suddenly disappeared from the server. They said, "because someone must have deleted it." I said, "well, no one did." (Which I'm pretty sure no one did.) They said, "you can pay us to look into the problem," so I authorized $150 for them to investigate. About an hour later, everything was magically reinstated; the host said they had a backup of everything and simply restored it. What I'm wondering: does anyone have recommendations of logs I can go through to investigate how the files got deleted in the first place? I've checked their cPanel logs but found nothing. Is it likely that this is a mess-up on the host's part?

  • Getting the keyword as a parameter from Adwords using ValueTrack

    - by Stephen Ostermiller
    I set up an AdWords campaign for a website following the instructions for Google AdWords ValueTrack. One of the things it is supposed to be able to do is pass the keyword as a URL parameter, using the token {keyword} in the URL. I set it up for integration with Google Analytics so that the landing URLs would look like: http://example.com/landing.html?utm_source=adwords&utm_medium=cpc&utm_term=%7Bkeyword%7D&utm_content=my_content&utm_campaign=my_page where {keyword} is in the utm_term parameter. However, this keyword substitution isn't happening. Why?
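
    A likely culprit: if the destination URL was saved with the braces already percent-encoded, as the %7Bkeyword%7D above suggests, AdWords never sees the literal {keyword} token and so never substitutes it. A two-line check of what encoding does to the token:

        # The ValueTrack token only works as the literal "{keyword}"; once the
        # braces are percent-encoded, the substitution is skipped.
        from urllib.parse import quote

        print(quote("{keyword}"))   # -> %7Bkeyword%7D (what the broken URL contains)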

  • URL rewriting on Plesk using ISAPI_Rewrite 3 Lite

    - by Anusha
    I am using a Windows-based Plesk web server (Windows Server 2008, IIS 6) for my e-commerce website. I want to rewrite the URLs for all dynamic pages, so I installed ISAPI_Rewrite 3 Lite on the web server and uploaded an .htaccess file with basic rules as follows:

        RewriteEngine on
        RewriteRule ^contact\.html$ contactus.php? [NC,R]

    I have never worked with ISAPI or with URL rewriting before. My question is how to proceed after installation: should I upload an .htaccess file or an httpd.conf file? The software also has an ISAPI_Rewrite Manager that allows editing httpd.conf; should I write the rules there instead? I have tried all of these steps, but unfortunately none of them worked. Any immediate solution would be appreciated.

  • Few New Features Added to Geekswithblogs.net

    - by Staff of Geeks
    After reviewing some of the feedback from our bloggers, we added a couple of new features to Geekswithblogs.net, and there are still more to come. Here is a list of the features we added:

      • Fixed the Twitter parser to better support URLs and hash tags
      • Added some hooks behind the scenes to tag posts with common keywords automatically
      • Added Facebook Likes and Tweets to the bottom of every post
      • Cleaned up a few skins
      • Images on the main page for bloggers who use Gravatar or Twitter integration
      • Random bug fixes based on the log

    We are definitely working to make Geekswithblogs.net faster and better. If you have any suggestions, please feel free to share them with the team. On a side note, if that suggestion is "move to WordPress", I will reply with "stop writing ASP.NET for your day job and move to PHP"; that request is the equivalent in my eyes. If we have enough bloggers leave the Microsoft .NET platform for their main source of income, we might consider it.

    Technorati Tags: Geekswithblogs.net, Features, Version 4.0

  • Rewrite rule to show as directory using .htaccess

    - by chanchal1987
    I want to implement a rewrite rule in my .htaccess file to show a specific URL as a directory on my server. See the rule I wrote below:

        RewriteRule ^(.*)/$ ?page=$1 [NC]

    This rewrites URLs like www.mysite.com/abc/ to www.mysite.com/index.php?page=abc. But if I request www.mysite.com/abc (no trailing slash) it throws a 404 error. How can I write a rewrite rule that will match both www.mysite.com/abc and www.mysite.com/abc/?

    Edit: My current .htaccess file (after the third revision of Litso's answer) looks like this:

        ##
        ErrorDocument 401 /index.php?error=401
        ErrorDocument 400 /index.php?error=400
        ErrorDocument 403 /index.php?error=403
        ErrorDocument 500 /index.php?error=500
        ErrorDocument 404 /index.php?error=404
        DirectoryIndex index.htm index.html index.php
        RewriteEngine on
        RewriteBase /
        Options +FollowSymlinks
        RewriteRule ^(.+)\.html?$ $1.php
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule ^(.*)/$ ?page=$1 [NC,L]
        RewriteCond %{REQUEST_URI} !index.php
        RewriteRule ^(.*)$ ?page=$1 [NC,L]
        ##

  • Google Webmaster Tools showing 6 pages submitted, 0 indexed, yet I can see them all when I search in Google?

    - by sam
    I have a small brochure-type site with 6 pages, and I can see all of the pages showing up in Google when I search for my site. But in Webmaster Tools, under the Sitemaps section, it says 6 submitted (the blue bar of the graph), while the indexed-pages red bar shows 0 indexed pages, even though they seem to be indexed. Any idea why this is? I don't really think it's that important, since the pages are still indexed, but it just seems odd.

    Update 9/3/12: Looking in Google Webmaster Tools just now, the Health > Index Status tab shows 11 pages indexed, but the Optimization > Sitemaps tab shows 6 URLs submitted and only 1 indexed. (Screenshots of the Index Status and Sitemap status were attached.)
