Search Results

Search found 14053 results on 563 pages for 'upk pro (knowledge pathwa'.


  • Site Search Engine for 1,000 page website

    - by Ian
    I manage a website with about 1,000 articles that need to be searchable by my members. The site search engines I've tried all had their own problems:

    Fluid Dynamics Search Engine: Since it's written in Perl, it was a bit hacky to integrate with my PHP-based CMS; I basically had to file_get_contents the search results page. However, FDSE had the best search results.

    Google CSE: Ugh, the search results suck. It can't find documents even when given unique strings. I'm surprised that a Google search product is this bad, and I can't get any answers on their 'help' forums even though I am a paying user. Boo, Google. Boo.

    Sphider: Again, bad search results; it was unable to locate some phrases used in link text. Still better results than Google CSE, though. Shame on Google that a free PHP script has better search results than their paid product.

    IndexTank: This one looked really promising. I got set up with their PHP API client, but it would only randomly add articles that I submitted: out of 700+ articles I pushed to the index through their API, only 8 made it in, and I was unable to find any help on the subject. Update: I got the above issue fixed, so this looks the most promising so far.

    The site itself runs on PHP/MySQL and FreeBSD, though this shouldn't matter for a web-crawling indexer. I've looked at Lucene, but I don't know anything about Java or installing Java programs on my web server, and I don't have root access there if that would be required for installation. I really don't need a lot of fancy features or crazy search operators, and it doesn't need to index anything off my primary domain. It just needs to crawl my web site and return great (or even decent!) search results. Thanks, Hive Mind!
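
    A minimal sketch of one more option, given the PHP/MySQL stack described: skip the crawler entirely and query a MySQL FULLTEXT index over the article table directly. The table and column names here are hypothetical, and the index has to exist first (e.g. ALTER TABLE articles ADD FULLTEXT(title, body); MyISAM before MySQL 5.6, InnoDB from 5.6 on).

        <?php
        // Assumes a hypothetical articles(id, title, body) table with a FULLTEXT index on (title, body).
        $pdo = new PDO('mysql:host=localhost;dbname=cms;charset=utf8', 'user', 'pass');

        function search_articles(PDO $pdo, $query, $limit = 20) {
            // Natural-language full-text search, ranked by relevance score.
            $sql = 'SELECT id, title,
                           MATCH(title, body) AGAINST(? IN NATURAL LANGUAGE MODE) AS score
                    FROM articles
                    WHERE MATCH(title, body) AGAINST(? IN NATURAL LANGUAGE MODE)
                    ORDER BY score DESC
                    LIMIT ' . (int) $limit;
            $stmt = $pdo->prepare($sql);
            $stmt->execute(array($query, $query));
            return $stmt->fetchAll(PDO::FETCH_ASSOC);
        }

        foreach (search_articles($pdo, 'some unique phrase from an article') as $row) {
            echo $row['title'], "\n";
        }

    This only searches what is already in the database, so it sidesteps the "crawl my site" requirement rather than meeting it; the trade-off is that there is nothing to install and no third-party index to keep in sync.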

    Read the article

  • Link a domain to a Squarespace account

    - by TheMightyLlama
    I have set up a Squarespace account and now need to link a domain I purchased elsewhere. What settings do I need to change where the domain is hosted? This is what I've got so far; however, the changes don't seem to have propagated and my website still shows a default page. Here are the settings that Squarespace says I need to use for GoDaddy, but I'm on A2Hosting and confused as to how to configure everything.

    Read the article

  • Rackspace Cloud Servers in Europe?

    - by mit
    We have set up a cloud virtual server at Rackspace in the US, but we use it from Europe, and I've found I'm not quite happy with the response time. Of course I knew there would be some latency, but I am not sure whether it is just the overseas latency (ping is 120 ms) or also the minimal resources. It is the smallest machine, 256 MB RAM and 10 GB disk, running a MediaWiki on Ubuntu 10.04 64-bit. The instance lives in the Rackspace ORD1 datacenter. As soon as they have opened their new facility in the UK we plan to move the instance there, but we are already planning more machines, and the pricing is quite attractive. I don't really want to do a round of measuring and benchmarking, so I am just asking for your opinions; it would be nice to hear what you can tell from experience, maybe from someone who uses such small instances in the US. And what can we really expect if we upgrade to more resources?

    Read the article

  • What is the best practice with KML files when adding a geositemap?

    - by Floran
    I'm not sure how to deal with KML files, which now seem particularly important in reference to the Google Venice update. My site is basically a guide of many company listings (a sort of Yellow Pages), and I want each company listing to have a geolocation associated with it. Which of the options I present below is the way to go?

    OPTION 1: all locations in a single KML file, with a reference to that KML file from a geositemap.xml.

    MYGEOSITEMAP.xml

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
            xmlns:geo="http://www.google.com/geo/schemas/sitemap/1.0">
      <url>
        <loc>http://www.mysite.com/locations.kml</loc>
        <geo:geo>
          <geo:format>kml</geo:format>
        </geo:geo>
      </url>
    </urlset>

    ALLLOCATIONS.kml

    <kml xmlns="http://www.opengis.net/kml/2.2" xmlns:atom="http://www.w3.org/2005/Atom">
      <Document>
        <name>MyCompany</name>
        <atom:author>
          <atom:name>MyCompany</atom:name>
        </atom:author>
        <atom:link href="http://www.mysite.com/locations/3454/MyCompany" rel="related" />
        <Placemark>
          <name>MyCompany, Kalverstraat 26 Amsterdam 1000AG</name>
          <description><![CDATA[<address><a href="http://www.mysite.com/locations/3454/MyCompany">MyCompany</a><br />Address: Kalverstraat 26, Amsterdam 1000AG <br />Phone: 0646598787</address><p>hello there, im MyCompany</p>]]></description>
          <Point>
            <coordinates>5.420686499999965,51.6298808,0</coordinates>
          </Point>
        </Placemark>
      </Document>
    </kml>
    <kml xmlns="http://www.opengis.net/kml/2.2" xmlns:atom="http://www.w3.org/2005/Atom">
      <Document>
        <name>MyCompany</name>
        <atom:author>
          <atom:name>MyCompany</atom:name>
        </atom:author>
        <atom:link href="http://www.mysite.com/locations/22/companyX" rel="related" />
        <Placemark>
          <name>MyCompany, Rosestreet 45 Amsterdam 1001XF</name>
          <description><![CDATA[<address><a href="http://www.mysite.com/locations/22/companyX">companyX</a><br />Address: Rosestreet 45, Amsterdam 1001XF <br />Phone: 0642195493</address><p>some text about companyX</p>]]></description>
          <Point>
            <coordinates>5.520686499889632,51.6197705,0</coordinates>
          </Point>
        </Placemark>
      </Document>
    </kml>

    OPTION 2: a separate KML file for each location, with a reference to each KML file from a geositemap.xml (KML files placed in a /kmlfiles folder).

    MYGEOSITEMAP.xml

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
            xmlns:geo="http://www.google.com/geo/schemas/sitemap/1.0">
      <url>
        <loc>http://www.mysite.com/kmlfiles/3454_MyCompany.kml</loc>
        <geo:geo>
          <geo:format>kml</geo:format>
        </geo:geo>
      </url>
      <url>
        <loc>http://www.mysite.com/kmlfiles/22_companyX.kml</loc>
        <geo:geo>
          <geo:format>kml</geo:format>
        </geo:geo>
      </url>
    </urlset>

    3454_MyCompany.kml

    <kml xmlns="http://www.opengis.net/kml/2.2" xmlns:atom="http://www.w3.org/2005/Atom">
      <Document>
        <name>MyCompany</name>
        <atom:author>
          <atom:name>MyCompany</atom:name>
        </atom:author>
        <atom:link href="http://www.mysite.com/locations/3454/MyCompany" rel="related" />
        <Placemark>
          <name>MyCompany, Kalverstraat 26 Amsterdam 1000AG</name>
          <description><![CDATA[<address><a href="http://www.mysite.com/locations/3454/MyCompany">MyCompany</a><br />Address: Kalverstraat 26, Amsterdam 1000AG <br />Phone: 0646598787</address><p>hello there, im MyCompany</p>]]></description>
          <Point>
            <coordinates>5.420686499999965,51.6298808,0</coordinates>
          </Point>
        </Placemark>
      </Document>
    </kml>

    22_companyX.kml

    <kml xmlns="http://www.opengis.net/kml/2.2" xmlns:atom="http://www.w3.org/2005/Atom">
      <Document>
        <name>companyX</name>
        <atom:author>
          <atom:name>companyX</atom:name>
        </atom:author>
        <atom:link href="http://www.mysite.com/locations/22/companyX" rel="related" />
        <Placemark>
          <name>companyX, Rosestreet 45 Amsterdam 1001XF</name>
          <description><![CDATA[<address><a href="http://www.mysite.com/locations/22/companyX">companyX</a><br />Address: Rosestreet 45, Amsterdam 1001XF <br />Phone: 0642195493</address><p>some text about companyX</p>]]></description>
          <Point>
            <coordinates>5.520686499889632,51.6197705,0</coordinates>
          </Point>
        </Placemark>
      </Document>
    </kml>

    OPTION 3?

    Read the article

  • Include a component inside an article in Joomla

    - by user1212
    How can one include a component inside an article in Joomla? I have tried using the "Include Component" plugin. I have a Phoca Guestbook set up which works fine on its own, but when I try to include it inside an article using {component url='index.php?option=com_phocaguestbook'} it just does not display the component. The same procedure works fine for other components. Any idea what I am missing here?

    Read the article

  • Different behaviour with Windows authentication on IIS7 websites

    - by amaters
    I need to run a website with just Windows authentication. Given the following situation: the location of the default website is c:\inetpub\wwwroot, the location of my code is c:\Sites\WebApp, and my hosts file is edited so any .local domain I use points to 127.0.0.1. I have created a new application called 'AppX' underneath the default website and pointed it to c:\Sites\WebApp; it uses the DefaultAppPool. When I switch off anonymous authentication and switch on Windows authentication, all works well when I go to localhost/AppX/. What I really want is a new website (no need to question why). So I created Website2 and created the application under it in exactly the same way. Everything is the same: destination, app pool, and authentication. Yet when I browse to this website at web2.local/AppX/ I get a 401.2 - Unauthorized error. What am I missing here?

    Read the article

  • Determine web page draw time via a program

    - by Kevin Burke
    Google Chrome has a nice tool in the Network tab of Developer Tools to determine the time at which the page begins drawing. Similarly, sites like webpagetest.org can tell you the draw time and give you the whole waterfall of page loads for a given web page. I was wondering if I could automate the process of finding the time to first page draw for all of the pages on my site, so I can share this data within my company. Obviously the draw time will depend on the latency and throughput of your connection, but I'm more concerned with the relative data between pages on our site. Can I get this data from Selenium or another tool? Thanks, Kevin
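
    One rough sketch of automating this, assuming a running Selenium server and the php-webdriver client, and using the browser's Navigation Timing API as a stand-in for true first paint, so the absolute numbers are only a proxy (the URLs are placeholders):

        <?php
        require 'vendor/autoload.php';

        use Facebook\WebDriver\Remote\RemoteWebDriver;
        use Facebook\WebDriver\Remote\DesiredCapabilities;

        $driver = RemoteWebDriver::create('http://localhost:4444/wd/hub', DesiredCapabilities::chrome());

        $pages = array('http://www.example.com/', 'http://www.example.com/about');
        foreach ($pages as $url) {
            $driver->get($url);
            // Navigation Timing: milliseconds from navigation start until the DOM is interactive,
            // used here as an approximation of when the page starts to draw.
            $ms = $driver->executeScript(
                'var t = window.performance.timing;' .
                'return t.domInteractive - t.navigationStart;'
            );
            echo $url, "\t", $ms, " ms\n";
        }
        $driver->quit();

    Because the measurements are all taken from the same machine and connection, they should still be useful for comparing pages against each other, which is the stated goal.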

    Read the article

  • Adding an ASP website in IIS7.5 on Windows 7

    - by birdus
    I'm trying to add an ASP website under IIS 7.5 on Windows 7 and am having no luck so far. This site is just for me to hit locally; I need to make some changes to some of the HTML in the ASP files and just need to be able to test my changes as I make them. I installed IIS and checked the box for ASP. Next, I added an application pool which I called ASP and which has "No Managed Code" and "ASP" set. Then I added the website by right-clicking "Sites" and clicking "Add Web Site...". I gave it a name, set it to use the ASP app pool, pointed it to the path where the ASP code is (leaving it at pass-through authentication), and typed in 5555 as the port so as not to interfere with the default website. The code is sitting on my server and the path simply uses the mapped drive that I always use to access files on that drive array. When I type in http://mysite:5555, I get "could not find mysite:5555". I don't really know if all these settings are correct or what else I should try. What am I missing? Thanks, Jay

    Read the article

  • Why does display:table-cell not center content without display:table?

    - by Samuel
    I'm looking for the most efficient (or elegant) way to vertically and horizontally center content of variable height. I've come up with this: http://cssdeck.com/t/2veysdkg/16, which uses CSS tables to vertically center the main content. My requirements for this particular piece of code were:

    - It must be able to center variable- and fixed-width content vertically and horizontally
    - Centered content must stay inside the normal document flow (so no overlapping)
    - Sticky footer and a normal header of 100% width
    - As few hacks, as little ugly code, and as little non-semantic HTML as possible
    - I didn't care about support for IE6 and IE7 (I'll use a different stylesheet for them)

    The weird thing is that the requirements above are only met when the header and footer are set to table-row and the body tag to display:table. That is strange because, as I understand it, CSS will generate anonymous table elements when parent table elements are missing, so table-cell should work without all the surrounding elements, yet I've not been able to make it work. If it were up to me I would prefer not to mess with the display mode of the body tag and to leave the header and footer on display:block, but I've not been able to make that work either. Does anyone understand why this doesn't work?

    Read the article

  • Spam prevention through IP tracking

    - by whamsicore
    I am building a website with user-generated comments. To implement user moderation and spam protection, users have the ability to mark comments as spam; when one comment is marked as spam, I want all comments from the same IP address to be deleted. I am not familiar with spam prevention in general, other than CAPTCHAs. Question: is this a feasible/good system for spam prevention? Are there better ways, or improvements I can make? Thanks.
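
    A minimal sketch of the deletion step, assuming a PHP/PDO setup and a hypothetical comments(id, body, ip_address, is_hidden) table. It hides rather than deletes the rows, so a mistaken or abusive spam report can be reversed (worth considering, since many legitimate users can share one IP behind a NAT or proxy):

        <?php
        $pdo = new PDO('mysql:host=localhost;dbname=site;charset=utf8', 'user', 'pass');

        function flag_comments_by_ip(PDO $pdo, $commentId) {
            // Look up the IP of the comment that was just marked as spam...
            $stmt = $pdo->prepare('SELECT ip_address FROM comments WHERE id = ?');
            $stmt->execute(array($commentId));
            $ip = $stmt->fetchColumn();
            if ($ip === false) {
                return 0;
            }
            // ...then hide every comment posted from that IP address.
            $stmt = $pdo->prepare('UPDATE comments SET is_hidden = 1 WHERE ip_address = ?');
            $stmt->execute(array($ip));
            return $stmt->rowCount();   // number of comments affected
        }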

    Read the article

  • Remove URLs to unindex blog content from Google, then copy the blog content to a new blog [closed]

    - by sam
    Possible Duplicate: migrating PR / rankings from one site to another

    I've been writing a blog for the past year or so, with about 300 published articles. The blog has been running under a subdomain, blog.mysite.com. We are now looking to change the URL of our site, so the blog is going to have to be ported over to a subdomain on the new site. We would really like to keep the backlog/archive of all the articles we have written, but don't want to be penalized for having duplicate content. Could we just remove/unindex the URLs from Google in Webmaster Tools, then export the blog and import it into our new blog? Would Google still see this as duplicate content, or, because I've removed the URLs, would they no longer have a copy of it? Thanks.

    Read the article

  • Some AdSense domains' ads are causing document.write() statements that remove the HTML from the page

    - by er1234
    All that is output on the page is the domain name of the advertiser, for example 'www.solar-aid.org'. The rest of the content is stripped, I believe because of a document.write() statement. I'd like to know whether this is a common issue or something wrong with our setup. There are three domains causing the issue, which we've blocked from AdSense as a result: solar-aid.org, kiva.org, and grameenfoundation.org. Given the type of organizations, I think they may be within the default group of 'public service ads' under the Backup Ads setting. If the issue doesn't completely resolve itself soon (one customer of ours complained today, even though I blocked them 5+ days ago), I'll disable public service ads and select the 'fill space with a solid color' option.

    Read the article

  • Cheap Bulk Domain Registration

    - by Panoy
    I have 6-7 domain names that I have thought of, and I'm planning to buy them in bulk so that I can save. Or am I wrong about this? Since this is my first time with hosting and domain registration, GoDaddy is the only registrar I knew of. Questions:

    - Will I lose out if I choose a cheap domain registrar compared to a popular one?
    - For a newbie like me, which companies can you recommend for registering domain names in bulk at a cheap or affordable price?
    - I notice that some prices are higher because those registrars offer support and customer service; does that mean the cheaper ones aren't reliable at all?
    - I've heard that some domain registrars increase their prices at every renewal; is that just natural business practice for registrars?

    Before posting this, I had been reading about NameCheap.com, and I'm considering registering with them unless you have other good choices to suggest. I'll appreciate every suggestion or piece of advice you can give.

    Read the article

  • Why does max-width behave counter-intuitively on columns in a table? [migrated]

    - by Nate
    Basically, I have a stretchy table: I want my label column to be fixed-width and my data column to be dynamically sized. My inclination would be to set max-width via CSS on the label column; however, this has the opposite effect. I've created a jsfiddle that replicates this (resize the window to see the left column dynamically sized and the right column at a fixed size). On my own site I see the same behavior, and it happens in both IE and Chrome. If I switch it around and set max-width on the data column, everything behaves as I want, but that feels backwards to me. Am I doing something wrong here?

    Read the article

  • Should my URLs be lowercase?

    - by Rowan Freeman
    According to this blog ("Understanding SEO Friendly URL Syntax Practices") I should change

    http://example.com/Hello-Dolly

    to

    http://example.com/hello-dolly

    The reasons given are:

    - URLs, in general, are case-sensitive
    - it will simplify any case-sensitive SEO and analytics reports

    According to this GIF that I found on Wikipedia's article on URL normalization, I should convert my URLs from any uppercase to all lowercase. However, I use ASP.NET MVC 4, and by default my URLs are structured like this (CamelCase):

    http://www.domain.com/Controller/Action/Parameter
    http://www.greatsite.com/Categories/List/Bicycles

    I've skimmed through RFC 1738 but didn't see any definitive answer to this. Should I go out of my way to force the framework to change everything to lowercase? Why did Microsoft choose to design their framework like this if everybody is telling me to use lowercase?

    Read the article

  • Blogging platform for shared hosting?

    - by Ankit
    First of all, this is not meant to start a flame war, so please don't bash any particular blogging platform :P I have a shared hosting account and want to set up a blog on it. The thing is that, as it is shared hosting, I do not have as much bandwidth as I would with more expensive hosting. I have tried using WordPress with caching plugins configured, and it still does not seem much faster. However, if I create a simple PHP website, the speed is fine. Can someone suggest a lightweight platform?

    Read the article

  • How do you find all the links to disavow for a Google reconsideration request? [duplicate]

    - by QF_Developer
    This question already has an answer here: How to identify spammy domains giving backlinks to my site (to submit in disavow links in WMT) (2 answers)

    A few months ago I received the following notification in Google Webmaster for a website I look after:

    Unnatural links to your site—impacts links
    Google has detected a pattern of unnatural, artificial, deceptive, or manipulative links pointing to pages on this site. Some links may be outside of the webmaster's control, so for this incident we are taking targeted action on the unnatural links instead of on the site's ranking as a whole. Learn more.

    The question here is: should we actively attempt to disavow these links, given that the action is seemingly targeted at just a bunch of keywords? I've downloaded the inbound links sample from Google Webmaster, and so far I've been through the disavow and reconsideration request process 6 times, each taking 2-3 weeks, only to be supplied with just 2 more links that Google doesn't approve of. At this rate it will take me the rest of my natural life to clean up all these spammy links! It seems disavowing is futile, as they haven't taken broad action against the website as a whole and (from what I can gather) have already nullified the value of those offending links. Under the quoted statement above, however, is a reconsideration request button that seems to imply I should be actively doing something here?

    UPDATE 14th October -- I have since created a small .NET application that you can feed the CSV links sample from Google Webmaster into. What this tool does is crawl all the links and look for specific linking patterns, as per some configurable match strings. I realised that many of the links Google is taking issue with were created by a rogue SEO firm we hired several years ago. All the links are appended with 1 of 5 different descriptions. The application I built uses some regexes to isolate any link sources with these matching appendages and automatically builds the disavow .txt file. In the end it had to come down to an algorithm, as manually disavowing links on this scale would take weeks! I will post the app here once I've cleaned it up.
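
    The tool described above is .NET and not yet posted; the fragment below is only a minimal sketch of the same idea in PHP, with two simplifications: it assumes the Webmaster Tools export has been reduced to one linking URL per line, and it matches the patterns against the URLs themselves rather than crawling each linking page. The file name and the example patterns are hypothetical.

        <?php
        // Patterns that identify links created by the rogue SEO campaign (hypothetical examples).
        $patterns = array(
            '/cheap-widgets-online/i',
            '/best-widget-reviews/i',
        );

        $domains = array();
        foreach (file('links_sample.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES) as $line) {
            $url = trim($line, " \t\",");
            foreach ($patterns as $pattern) {
                if (preg_match($pattern, $url)) {
                    $host = parse_url($url, PHP_URL_HOST);
                    if ($host) {
                        $domains[$host] = true;   // de-duplicate by linking domain
                    }
                    break;
                }
            }
        }

        // Google's disavow file format accepts either bare URLs or "domain:" lines.
        $out = fopen('disavow.txt', 'w');
        fwrite($out, "# Linking domains matching known spam patterns\n");
        foreach (array_keys($domains) as $host) {
            fwrite($out, 'domain:' . $host . "\n");
        }
        fclose($out);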

    Read the article

  • IIS MIME type for XML content

    - by Rodolfo
    Recently a third-party plugin I'm using to display online magazines stopped working on mobile devices. According to their help page, this happens for people serving with IIS, and their solution is to set the MIME type for .xml to "application/xml"; by default IIS sets it to "text/xml". Changing it does work, but would that have unintended side effects, or is "application/xml" actually the correct type and IIS just has it set wrong?

    Read the article

  • How should I update my name server after I installed a new dedicated server?

    - by Jim Thio
    Say I got a dedicated server with the IP 123.123.123.123, and I have the domain name domainname.com that will be the "main" domain name for that server. Should I:

    - set the name servers of domainname.com to ns1.domainname.com and ns2.domainname.com, and
    - add child name servers ns1.domainname.com and ns2.domainname.com pointing to that exact IP?

    Or should I:

    - point the name servers to my registrar's name servers, and
    - set an A record for the name server host pointing to my IP?

    Which one is right? Obviously I want ns1.domainname.com and ns2.domainname.com to point to my IP so I can then point hundreds of domains to that IP, but how exactly should I do that? Specifically, I simply use cPanel: CentOS with cPanel.

    Read the article

  • Could Ajax + Caching be seen as cloaking?

    - by Angel
    I have a website where we use a technique to speed up loading times based on a combination of AJAX and caching. Basically, when we have a section of a page whose content is slow to retrieve, we first check whether it's cached. If it is, we serve the content; if it's not, we serve a placeholder and then make an AJAX call from the client to retrieve the content, which is then cached for subsequent requests. As a consequence, sometimes you get the entire page content in the first request, and sometimes you get those placeholders, which get filled immediately with the responses of the AJAX requests. You can see an example in the results count by category in the right column of this page: http://www.inzoco.com/crits/2-1-3-28-185-0-28079-0-0/listado-piso-en-alquiler-en-madrid-madrid.aspx I'm worried it could be seen as cloaking by search engines, because if you request a page whose content isn't cached and then ask for the same page again, you get different responses: the first with the placeholders and AJAX requests, and the second with all the content rendered.
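
    For reference, a stripped-down sketch of the pattern being described, assuming a PHP front end with the APCu extension for the cache (the function and key names are hypothetical; the example page above is .aspx, so this is only to illustrate the flow, not the actual implementation):

        <?php
        // In the page template: emit the cached fragment if we have it, otherwise a placeholder.
        function render_fragment($key) {
            $cached = apcu_fetch($key);
            if ($cached !== false) {
                echo $cached;                       // fast path: fragment already cached
            } else {
                // Client-side JS is expected to spot the placeholder, call fragment.php?key=...,
                // and inject the response into the div.
                echo '<div class="placeholder" data-fragment="' . htmlspecialchars($key) . '"></div>';
            }
        }

        // In fragment.php (the AJAX endpoint): build the slow content once, cache it, return it.
        function serve_fragment($key) {
            $html = build_slow_content($key);       // hypothetical expensive query/rendering
            apcu_store($key, $html, 600);           // keep it for 10 minutes
            header('Content-Type: text/html; charset=utf-8');
            echo $html;
        }

    The first visitor to an uncached page gets the placeholder version and the next one gets the full version, which is exactly the difference the question worries a crawler might interpret as cloaking.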

    Read the article

  • Does hiding images on 404 error affect SEO?

    - by Question Overflow
    I have a dynamic website that allows registered users to upload and display images on their profile pages. As each user may upload fewer than the maximum of 20 images, there can be some "empty" image slots on a page, and I am using JavaScript to hide these empty images. Loading the profile page therefore generates a series of 404 errors, one for each empty image. Would these 404 errors affect the SEO of the page and of the website?
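
    If the list of uploaded files is known server-side, one alternative worth sketching (assuming PHP, with hypothetical directory and variable names) is to emit only the image tags that actually exist, so there is nothing to hide with JavaScript and no 404s are requested in the first place:

        <?php
        // $filenames is the profile's list of up to 20 image slots; empty slots are skipped.
        function render_profile_images(array $filenames, $dir = '/var/www/uploads', $baseUrl = '/uploads') {
            foreach ($filenames as $name) {
                if ($name !== '' && is_file($dir . '/' . $name)) {
                    // Only emit tags for files that really exist on disk.
                    printf('<img src="%s/%s" alt="">' . "\n", $baseUrl, htmlspecialchars($name));
                }
            }
        }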

    Read the article

  • Ranking hit after site migration

    - by Ben
    I migrated my site from its old domain over a month ago. I followed the Google Webmaster Tools guidance completely, including 301 redirects from every existing URL to the new domain, and then submitted a change of address. Traffic continued as normal, but a few days after submitting the change of address it plummeted to about 20-30% of what it was previously. Most of my traffic comes from organic search, and I can see that I am now ranking much, much lower for the keywords I had targeted and performed well with before. In some cases, for low-competition keywords, I've only lost a few places; for higher-competition terms I have really suffered. This has started to pick up a bit (for one of my keywords I have risen from 195 to 100 in the last week), but it seems to be a very slow process. How seamless is this process normally? I was under the impression that the move would not affect my rankings too severely, but it has now been a month and recovery seems very slow, if it is happening at all. Is it likely that I've missed something? The only change is that I have moved what was the home page to be more of a sub-page, and in its place is now a magazine-style home page. I understand that links to the old site will now point to the latter, which means that rankings for some keywords attributed to the old home page will take a hit, but even on other pages that fit exactly the same page structure as the previous site I have seen a drop in rankings.

    Read the article

  • Pins on Pinterest link to 404 - yet link is valid

    - by Damian Rees
    This is a really strange one. I've set up a bunch of pins on Pinterest linking through to our services, and they all work fine. Then I decided to do the same for our blog articles (we use WordPress), yet every time I click one of those links (and I've done this on different computers) it goes to a 404 page on our site. However, the link is valid, and if you right-click the pin and open it in a new window it opens fine. I have contacted Pinterest, who are next to useless. I have also tried different browsers, different computers, and different Pinterest accounts. I can't see any weirdness in my .htaccess file that would cause this, so I'm a little stumped. Any suggestions?

    Read the article

  • Thousands of 404 errors in Google Webmaster Tools

    - by atticae
    Because of an old error in our ASP.NET application, created by my predecessor and undiscovered for a long time, thousands of wrong URLs were created dynamically. Normal users did not notice it, but Google followed these links and crawled its way through the incorrect URLs, creating more and more wrong links. To make it clearer: the URL example.com/folder should have created the link example.com/folder/subfolder, but was creating example.com/subfolder instead. Because of bad URL rewriting this was accepted, and by default the index page was shown for any unknown URL, creating more and more links like example.com/subfolder/subfolder/... The problem is resolved by now, but I still have thousands of 404 errors listed in Google Webmaster Tools, for URLs that were discovered 1 or 2 years ago, and more keep coming up. Unfortunately the links do not follow a common pattern that I could disallow for crawling in robots.txt. Is there anything I can do to stop Google from trying those very old links and to remove the already listed 404s from Webmaster Tools?
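
    One approach sometimes suggested, sketched below with a hypothetical is_known_url() lookup against the application's real routes (and in PHP purely for illustration, since the site in question is ASP.NET): answer the bogus URLs with 410 Gone instead of 404, which signals that they are intentionally gone for good and is generally treated by Google as a more permanent signal than a 404.

        <?php
        // Front-controller fragment: anything that is not a real page gets an explicit 410.
        $path = parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH);

        if (!is_known_url($path)) {            // hypothetical lookup against the route table
            http_response_code(410);           // "Gone": the URL is dead on purpose
            header('Content-Type: text/html; charset=utf-8');
            echo '<h1>410 Gone</h1>';
            exit;
        }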

    Read the article

  • Issue with GoDaddy DNS manager

    - by Fischer
    I'm using domains.live.com to set up email for a domain registered with GoDaddy. The domains.live.com configuration page gives me this value to add: v=spf1 include:hotmail.com ~all. GoDaddy's DNS manager isn't accepting the string; it gives an error. Something is wrong, either with the string or with the DNS manager, and I would like to know how to fix it. Notes: the 'more information' link is dead, GoDaddy no longer gives support by email, and there is no Microsoft support.

    Read the article
