Search Results

Search found 10402 results on 417 pages for 'macbook pro'.


  • Aliasing resellers domain to primary domain

    - by Ashkan Mobayen Khiabani
    I have designed a website that accepts resellers; the concept is to have a local reseller (or branch, if you like) for each province. The site is built so that anyone who owns a domain can point it at our website (via an A record or CNAME). Most of the content is the same for every reseller; the only differences are that a reseller's site omits some items from the main menu and may add a short description of its own branch on some pages. I have read that Google may ban websites with duplicate content (or content that is significantly similar). Will this be a problem for me? If so, what else can I do? We had considered asking each reseller to use an iframe that loads our website, but we wanted every reseller to be able to do its own SEO and try harder, and what I have read about duplicate content worries me.
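
    For illustration only (not from the original question), this is roughly what a reseller's DNS records might look like when pointing a domain at the main site; all names and the IP address are hypothetical:

    ```
    ; hypothetical zone file entries for a reseller's domain
    reseller-province.example.      3600  IN  A      203.0.113.10      ; main site's IP
    www.reseller-province.example.  3600  IN  CNAME  mainsite.example. ; alias to the main site
    ```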

    Read the article

  • Best scripting language for project [on hold]

    - by Dave
    This is a subjective question, but I don't know where else to ask it. I'd appreciate it if someone could direct me to an appropriate scripting language for my project; I'm a little new at this, so any help is welcome. The project is a website that will display a list of photo subject groups (such as "nature", "people", "sports", etc.) on the home page. The photos will all be in subdirectories of the main photo directory (photos), and each subject group will correspond to a subdirectory of photos. For example, the photos directory might contain three subdirectories, "nature", "people", and "sports", and each of those subdirectories will hold the actual photos. The idea is that when the website owner wants to add, update, or delete a subject group, all he has to do is add, update, or delete a subdirectory of the photos directory. This means, I think, that I need a scripting language that can read the directories and files on the server and then send a web page built from that information. What is the simplest and easiest scripting language for this? Any ideas? Thanks
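
    The directory-driven part of this is small in any language. As a minimal sketch only (Python is one reasonable choice, not a recommendation from the original post, and the photos/ path is an assumption), the subject groups and their images can be read like this:

    ```python
    # Build the list of subject groups from the subdirectories of photos/.
    import os

    PHOTO_ROOT = "photos"  # hypothetical path, relative to the script

    def subject_groups(root=PHOTO_ROOT):
        """Return a mapping like {'nature': ['a.jpg', ...], 'people': [...], ...}."""
        groups = {}
        for name in sorted(os.listdir(root)):
            path = os.path.join(root, name)
            if os.path.isdir(path):
                groups[name] = sorted(
                    f for f in os.listdir(path)
                    if f.lower().endswith((".jpg", ".jpeg", ".png", ".gif"))
                )
        return groups

    if __name__ == "__main__":
        for group, photos in subject_groups().items():
            print(group, "-", len(photos), "photos")
    ```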

    Read the article

  • SEO disasters moving domain for a high traffic website?

    - by chrism2671
    We're looking at moving our website from http://www.wikijob.co.uk to http://www.wikijob.com/uk as we spread our wings internationally. Our .co.uk website has a PR6 and receives around half a million visitors a month, 40% of them international. The wikijob.com domain, while registered for a while, has not been used or promoted. I am concerned that moving domain could really haemorrhage our traffic and result in a loss of goodwill from Google, even if we use a 301, but equally, if we could transfer that PageRank to the .com domain, it would give us a massive head start around the world. Should we do it, or should we start over with .com and leave .co.uk as is?
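
    For what "use a 301" would look like in practice, a hedged sketch only (assuming Apache with mod_rewrite on the old host; this is not the poster's actual configuration):

    ```apache
    # Redirect every page on the .co.uk host to the same path under /uk/ on the .com.
    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^(www\.)?wikijob\.co\.uk$ [NC]
    RewriteRule ^(.*)$ http://www.wikijob.com/uk/$1 [R=301,L]
    ```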

    Read the article

  • Why is Google still not indexing my #! website?

    - by Zubair
    I have been working on a website which uses #! (2minutecv.com), but even after 6 weeks of the site being up and running and conforming to the Google hash-bang guidelines stated here, you can see that Google still hasn't indexed the site. For example, if you use Google to search for "2MinuteCV.com benefits", it does not find this page, which is referenced from the homepage. Can anyone tell me why Google isn't indexing this website? Update: Thanks for all the help with this answer. So, just to make sure I understand what is wrong: according to the answers, Google never actually indexes the pages after the JavaScript has run. I need to create a "shadow site" which Google indexes (which Google calls HTML snapshots). If I am right in thinking this, then I can pick a winner for the bounty.
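
    For context on how the hash-bang scheme was meant to work: Google rewrites a URL such as 2minutecv.com/#!benefits into 2minutecv.com/?_escaped_fragment_=benefits and expects the server to answer that request with a pre-rendered HTML snapshot. A hedged sketch only (Flask and the template names are assumptions, not the poster's stack):

    ```python
    from flask import Flask, request, render_template

    app = Flask(__name__)

    @app.route("/")
    def index():
        # Googlebot requests /?_escaped_fragment_=<page> instead of /#!<page>.
        fragment = request.args.get("_escaped_fragment_")
        if fragment is not None:
            # Serve the pre-rendered HTML snapshot for this page.
            return render_template("snapshots/%s.html" % (fragment or "home"))
        # Normal visitors get the JavaScript-driven page.
        return render_template("app.html")
    ```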

    Read the article

  • remove ssl from Google search results

    - by user73457
    I am the webadmin of a WordPress site that serves up http pages statically. The problem is that some of the pages are shown as https in Google search results. For instance, if the search term "Example Press Kit" is entered the search result site link comes up as: https://example.com/presskit/ We don't have a site ssl certificate, so surfers are being bounced. I have tried everything. Most recently I created a new website in Google WebAdmin for the https version of our home page. Then, I added sitelinks that should have redirected site links intended for https://example.com/* to http://example.com/*. But it doesn't work! Google still shows a dead link to http://example.com/presskit. I didn't think dead links lasted very long on Google results, but there they are, two weeks later. Any ideas?
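
    One commonly suggested mitigation (not something the original post describes trying; example.com is just the placeholder already used above) is to declare the http URL as canonical on every page, so Google consolidates the https variants onto it:

    ```html
    <!-- in the <head> of each page -->
    <link rel="canonical" href="http://example.com/presskit/" />
    ```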

    Read the article

  • HTML background-size:cover with floating objects

    - by Mikhail
    I have a trivial page with body having an image background, with background-size:cover. I set html { height:100% } to fill up the entire page regardless of the content amount. Up to this point everything worked as expected. I've added a div and set position:absolute; right:0; width:200px; This, again, worked as expected, until I added content. When this div is populated so much that the contents take up more space than the height of the page, the scroll bar appears. Scrolling down reveals that the background image does not actually cover the entire page. This is due to the fact that my div is taller than 100% of the HTML height. How can I address this?
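
    A hedged sketch of one common way out (not from the original question; the image path is hypothetical): let html/body grow with the content via min-height, and pin the cover background to the viewport so it keeps covering while the page scrolls:

    ```css
    html, body {
      margin: 0;
      min-height: 100%;               /* grow with the content instead of locking to 100% */
    }
    body {
      background-image: url("background.jpg");
      background-size: cover;
      background-attachment: fixed;   /* the image stays sized to the viewport while scrolling */
    }
    ```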

    Read the article

  • Where to find a template or script with a frame on the left side (list of article headlines) and the content on the right

    - by Gero
    I am looking for something like the following: http://www.scala-lang.org/api/current/index.html#scala.Any http://resources.arcgis.com/en/help/arcobjects-net/componenthelp/index.html#/Overview/004t00000009000000/ On the left side I want to create, in some admin tool, categories and subcategories and add names/links to the articles shown on the right side. When I click on one of those articles/links, I would see its content on the right side. Is there any script or template (or whatever) that would allow me to do that?
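
    As a bare-bones illustration of the layout itself (a minimal sketch, not a specific template; the file names are hypothetical), a headline list can target a named frame that holds the article content:

    ```html
    <div style="display: flex;">
      <ul style="width: 250px; overflow-y: auto;">
        <li><a href="article1.html" target="content">First article</a></li>
        <li><a href="article2.html" target="content">Second article</a></li>
      </ul>
      <!-- links with target="content" load into this frame -->
      <iframe name="content" src="article1.html" style="flex: 1; height: 90vh; border: 0;"></iframe>
    </div>
    ```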

    Read the article

  • Parallels Plesk 11 missing Web Presence Builder

    - by NRGdallas
    After a recent upgrade to Parallels Plesk 11, we decided to start using their Web Presence Builder tool. However, every video, documentation page, and instructional I view shows that the link should be under Websites & Domains, or even on the home page. It is in neither location. I have verified that it is both installed and up to date, under Server > Updates and Upgrades. Any idea how I access the Web Presence Builder?

    Read the article

  • What marketplace / garage-sale software package does togoparts.com use?

    - by gus
    See: OpenSource Marketplace Platform. I also want to start a site for end-users to buy/sell used sporting goods of a particular type. When the scope of goods is narrowed like this, it is very advantageous to be able to filter by Brand, Size, Price Range, etc. Nice features:
    - account reputation with user comments
    - listings sortable by many custom fields
    - auto resize and recompress image uploads
    I don't want to reinvent the wheel, so does anyone know where I can start?

    Read the article

  • What kind of spam is this?

    - by SSilk
    I realize this is a pretty vague question, but I occasionally get spam messages through my contact form on a Drupal 6 site. The contact form does not have any anti-spam protection (e.g., a math question). The messages I get are all very similar and just jumbled junk, like the one below, so I think they're all from the same source. Example: ylsaf0V bpsdfuxnhjjd, [url=http://wwgfsggzgyjyjm.com/]wwgrfgzrgsjyjm[/url], [link=http://xmgvyghcuufvb.com/]xmjyhvyjyfjirovb[/link], http://frgxmdghrgruhfc.com/ Anyway, I'm just wondering what the point of such a message is. All the links are dead, it's illegible, and it's not trying to sell me a product or get me to do anything, so I'm a bit perplexed. Is there any way to tell where they're coming from? And how concerned should I be? To be clear, I'm not asking how to avoid them; I realize just adding a simple math challenge or captcha would likely do the job.

    Read the article

  • What zoomable image viewers are there for websites?

    - by tog22
    What zoomable image viewers are there? By these I mean tools that one can embed in a website to let a user zoom in on and pan around a high res image, a canonical example being http://www.zoomify.com/ (see the demo on their home page). Comments on them are welcome. I'm personally looking for something simple and cheap/free which ideally doesn't require Flash, and will accept the answer that comes closest to these requirements. But others who find this question may have different requirements, so all suggestions will be helpful. I have of course searched; I've found Zoomify, http://www.openzoom.org/ and http://code.google.com/p/galapix/ but none seem to meet my requirements, though I could be wrong and others may have more expert comments on these.

    Read the article

  • Setting a folder to be writable by Apache/PHP in Windows?

    - by Chris Sobolewski
    I have a local test server, and I am attempting to write a file with PHP. I am getting a message that the folder (../uploads/) does not exist or I do not have permission. My directory structure is:
    - D:\xampp\htdocs\website\          <-- root
    - D:\xampp\htdocs\website\library   <-- where the script runs
    - D:\xampp\htdocs\website\uploads   <-- where I'd like to save
    I know on a *nix server I can just chmod the permission to 0777. What do I need to set on my Windows box to give Apache the ability to write a file?
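
    A hedged sketch of the Windows-side equivalent (not from the original post; which account to grant depends on how Apache is installed, so "Users" here is an assumption that covers common XAMPP setups):

    ```
    REM Grant modify (M) rights on uploads\, inherited by new files (OI) and subfolders (CI).
    icacls "D:\xampp\htdocs\website\uploads" /grant "Users:(OI)(CI)M"
    ```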

    Read the article

  • Is PPC marketing effective for Web Design & Development services?

    - by Pennf0lio
    Has anyone tried to run a PPC campaign, as an individual rather than a company, for services like:
    - Web Design
    - Web Development
    - Graphic Design
    - Logo Design
    - Programming
    I've seen companies advertise web services; one popular example I've seen is PSD2HTML. But I was wondering whether it is also applicable to individuals trying to market themselves. If it is, can you give some examples of how it is being put to best use? Thank you!
    Edit: I'd like to know whether a PPC campaign is effective for the services mentioned above as an individual, not a company.

    Read the article

  • Incorrect Meta information in Google

    - by Ashfame
    Google shows incorrect meta information (title & description) in its search results for an add-on domain; the information it shows belongs to the primary domain of the hosting account. I mention this fact because add-on domains live in a subdirectory of the primary domain. Any ideas what the reason could be? Check this Google search, which shows the information of http://katherinegaudette.com/

    Read the article

  • Website Ad Management tools

    - by vishnu
    Our company plans to buy a large number of cheap sites online as part of marketing our main product. Those websites currently carry a huge number of ads, which are to be replaced with ours (Google AdSense, ClickBank, etc.). Is there a free, open-source tool available online to replace these ads and to track and manage them? I would also like to discuss the feasibility of purchasing a large number of sites for SEO and marketing. How easy is it going to be to manage these websites?

    Read the article

  • Rewriting a URL for Tomcat through an AJP connection

    - by StudentKen
    I've made several attempts to resolve this, but all have come to naught. Currently I have Apache set up to forward all URLs at and below /portal/ to Tomcat. Unfortunately, Tomcat receives these requests as /portal/appName, i.e. under a subdirectory of webapps, rather than at the webapps root where my WARs are deployed. Is there a simple solution to this that I'm not seeing? I've been trying to use mod_rewrite to rewrite ^/portal/ to /, but that doesn't yield the expected results (perhaps I'm doing this wrong?).
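
    A hedged sketch of one way to strip the prefix on the Apache side (not the poster's actual configuration; it assumes mod_proxy and mod_proxy_ajp are loaded and Tomcat's AJP connector listens on 8009):

    ```apache
    # Requests for /portal/whatever are handed to Tomcat as /whatever,
    # so the ROOT webapp sees them without the /portal prefix.
    ProxyPass        /portal/  ajp://localhost:8009/
    ProxyPassReverse /portal/  ajp://localhost:8009/
    ```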

    Read the article

  • Is this Anti-Scraping technique viable with Crawl-Delay?

    - by skibulk
    I want to prevent web scrapers from abusively crawling the roughly 1,000,000 pages on my website. I'd like to do this by returning a "503 Service Unavailable" error code to users that access an abnormal number of pages per minute. I don't want search engine spiders to ever receive the error. My inclination is to set a robots.txt crawl-delay which will ensure spiders access a number of pages per minute that stays under my 503 threshold. Is this an appropriate solution? Do all major search engines support the directive? Could it negatively affect SEO? Are there any other solutions or recommendations?
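
    For reference, the directive in question is a single robots.txt line; a minimal sketch (the 10-second value is just an example, chosen so a compliant spider stays at about 6 pages per minute):

    ```
    User-agent: *
    Crawl-delay: 10
    ```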

    Read the article

  • We have a 200% increase in "organic" search traffic - how to figure out which keyword is causing this?

    - by Robert Grezan
    Our Google Analytics is showing a 200% increase in "organic" search traffic, and it reports the search keyword as "(not provided)". We are wondering how to find out which keyword is causing this. We monitor all the important keywords for our website; none of them is in the first 5 positions, so our "organic" search traffic is normally modest. Today, however, we received a 200% increase in "organic" search traffic, yet none of the keywords we can think of has moved at all. We also did not change anything related to SEO. And, interestingly, Google Webmaster Tools shows no change: ~2500 impressions and ~200 clicks. How can we find out which "keyword" might be causing this spike?

    Read the article

  • New site not appearing in index after change of address, no feedback from Google Webmaster Tools

    - by Duffy
    Our change of address seems not to be taking effect. Here's the story so far: we're a web company and our product is called The New Hive. Our site used to be at thenewhive.com, but we decided to switch to newhive.com (drop the "the", it's cleaner). The timeline of what I've tried, starting on July 29th:
    - used 301 redirects for all pages (e.g. thenewhive.com/tag/art = newhive.com/tag/art)
    At this point we noticed that we had disappeared from search results when searching "The New Hive"; the front page used to be all links to our site plus a couple of news articles about the company. So on August 5th I:
    - verified the new domain in Webmaster Tools (the old domain was already verified)
    - submitted a change of address request with Webmaster Tools / Configuration / Change of Address
    Then after another week, on August 13th, I:
    - went to Webmaster Tools / Health / Fetch as Google
    - fetched our homepage and a couple of sub-pages, all successfully
    - clicked "Submit to Index" for the homepage
    As of today (August 23rd) we're still not showing up in the index. We're getting no warnings or feedback of any kind from the dashboard, so I'm inclined to think something's broken with the dashboard rather than that something's wrong with our site from an SEO perspective. From the dashboard: no new messages or recent critical issues; Crawl Errors: no data available. From Health - Index Status:
    - Total indexed: 0
    - Ever crawled: 42,490
    - Not selected: 12
    - Blocked by robots: 0
    I'm really at a loss here, any help would be appreciated.
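
    One hedged sanity check (not something mentioned in the original post) is to confirm from the outside that the old URLs really do answer with a 301 and a Location header pointing at the new host, for example:

    ```
    # fetch only the response headers for one of the redirected pages
    curl -sI http://thenewhive.com/tag/art
    ```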

    Read the article

  • If a blogger writes a whole article about my website, how important are anchor texts?

    - by Noam
    Suppose there is a full article about my web service, with my brand name in the title, many relevant keywords that I would like Google to consider in my rankings, and links to my website with simple anchor text such as <brand name> and <page title>. Does it make a big difference if I get links on the actual keywords I'm after, or is it enough that these keywords are part of the written text?
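
    Purely to illustrate the distinction being asked about (hypothetical markup, not from the original question):

    ```html
    <a href="http://example.com/">BrandName</a>             <!-- branded / page-title anchor -->
    <a href="http://example.com/">online invoicing tool</a> <!-- keyword-rich anchor -->
    ```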

    Read the article

  • Is redirecting mydomain.eu & mydomain.net to mydomain.com using .htaccess spammy?

    - by sam
    A client has asked me to develop their site. They already own three domains, mydomain.eu, .net and .com, and they want all the traffic from .eu and .net to redirect to .com. I've explained to them that it is not that relevant, as people will find them through search engines rather than typing in the domain, but they would still like me to do it. As far as I know this is fine from an SEO point of view, but I thought I'd just double check.
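
    For reference, the redirect itself is a couple of lines of .htaccess; a hedged sketch only (it assumes Apache with mod_rewrite and that all three domains point at the same document root):

    ```apache
    RewriteEngine On
    # Any request arriving on the .eu or .net host is 301-redirected to the same path on the .com.
    RewriteCond %{HTTP_HOST} ^(www\.)?mydomain\.(eu|net)$ [NC]
    RewriteRule ^(.*)$ http://mydomain.com/$1 [R=301,L]
    ```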

    Read the article

  • How do you save/export changes made in Firebug?

    - by blunders
    Using Firebug to edit CSS, how do I save/export the changes made to the CSS? TOOLS: Firefox, Firebug. MAJOR UPDATE: If you know of a way to lock the forward/back/refresh on a Firefox tab, please let me know. Otherwise, I've given up on using Firebug/FireDiff as an IDE for CSS; it's nice, but press backspace at the wrong time and ALL your work is gone. So, I really like the browser highlighting of CSS/HTML in Firebug. Know any good CSS editors that do this? I really had hoped Firebug would work, but for now I only see it as being good for ad-hoc inspection and testing; meaning using it for what it's made for. UPDATES: @Lèse majesté: Just as an update, the Web Developer add-on does let you edit CSS, but it does not let you edit/save CSS changes made by Firebug. Meaning you can use Firebug to identify and maybe test changes, but it does not let you save those changes from Firebug. Here's a "how to" covering how to use them together: FF + FB + WD. @Lèse majesté: Still playing around with FireDiff. It works okay; I found one bug already (although I'm just working around it), and there's no "how to" I've been able to find, so I'm just trying every feature and clicking around (for example, to export a diff you must be over the last item in the list, right-click, and select "Save Diff"). The ".diff" is just a text file; no idea at this point why the extension is .diff.

    Read the article

  • Off-site Cardholder Data Storage

    - by LinuxGnut
    Is there a service or site out there that will store cardholder data for me? I don't need any kind of transaction processing or recurring billing; I just need somewhere I can store data until someone in my company is able to look at it. The specific need is allowing customers to input data that will be used for credit checks: name, address, credit card(s), and such. Google Checkout, PayPal, NetSuite, and Authorize.net seem to be what everyone suggests to me, but they don't offer what I need; they're just payment gateways.

    Read the article

  • GUI question: representing a large tree

    - by Peter
    I have a tree-like data structure some six levels deep that I would like to represent on a single web page (it can be tabs, trees, ...). At each level both child nodes and content are possible. Presenting it as a literal tree would not be very usable (too big). I was thinking along the lines of hiding parts of the tree as you drill down and presenting breadcrumbs or the like to keep you informed as to where you are. I guess my question boils down to: any ideas / examples? Thanks!
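
    Just to pin down the shape of the data being discussed (a minimal sketch with hypothetical names, not the poster's actual structure):

    ```python
    from dataclasses import dataclass, field

    @dataclass
    class Node:
        title: str
        content: str = ""                                      # a node may carry content of its own
        children: list["Node"] = field(default_factory=list)   # ...and/or child nodes

    # a tiny two-level example; the real structure is about six levels deep
    tree = Node("root", children=[
        Node("Chapter 1", content="intro text", children=[
            Node("Section 1.1", content="details"),
        ]),
    ])
    ```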

    Read the article

  • My page no longer shows up in Google's results for a keyword

    - by user6456
    I have a small website about a commercial product, with a description and a tutorial. Two days ago it was in 11th position in Google's search results, without any kind of SEO optimization on my part. Today it's gone, totally gone: not even in the first 200 results. It's still ranked very high on bing.com and duckduckgo.com. The site is very much on topic; it's hosted under the domain Keyword.com, and it's about a commercial product which addresses the Keyword. How can I find out what happened?

    Read the article
