Search Results

Search found 5380 results on 216 pages for 'webmasters'.

Page 88/216 | < Previous Page | 84 85 86 87 88 89 90 91 92 93 94 95  | Next Page >

  • Is it legal to ask for a photo ID and a credit card copy in the U.S.?

    - by selim
    I regularly order from online shops around the world, and I have not seen any case where the company asks for a photo ID or a credit card copy. Yesterday I placed an order with linode.com, and my order is on hold because of their "fraud check system". Is it common to ask for this information in the U.S.? I have never been asked for it here (Istanbul, Turkey). I already asked what their motive and legal standing are, and why my order was held by their fraud system; I also asked whether it is because I live in Istanbul, Turkey. Their answer was as follows: "We would not be able to disclose specific information related to our fraud system." And they keep asking me repeatedly whether I want to cancel my order or not. I am not questioning linode.com's reputation; if I were, I would not have placed the order. I think asking for a photo ID is neither legal nor does it provide any security.

    Read the article

  • How to list pages in a category as bullet points in MediaWiki?

    - by Sandra Schlichting
    I'm using MediaWiki and I would like to list the pages in a specific category in alphabetical order. For example, if the following five pages have the category "Backup": Printing, Programming, Remote Access, Remote Sound, Smartcard, I would get a bulleted list of them. DynamicPageList with <DynamicPageList> category = Backup </DynamicPageList> gives me what I am looking for, but not in alphabetical order by page title. Does anyone know how to do this?
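
    One possible approach, assuming the third-party DynamicPageList (DPL3) extension is installed rather than the simpler Wikimedia "intersection" version: DPL3 documents an ordermethod parameter, so a sketch along these lines should sort by title (the tag name and parameters below are DPL3's and may differ in other DPL variants):

        <dpl>
        category    = Backup
        ordermethod = title
        order       = ascending
        mode        = unordered
        </dpl>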

    Read the article

  • Small, fast shopping cart setup

    - by R..
    I'm looking for an open source shopping cart solution that's simple and easy to set up. The requirements are:
    - Quick setup for someone familiar with *nix web servers.
    - Checkout via PayPal (other payment methods not needed).
    - Customers should not have to create an account to make a purchase.
    - At least a minimal level of inventory control.
    - Ability to print/export a list of orders in compact form.
    Any recommendations for something I should try? Ability to get it up and working quickly is really my priority right now; if it's not ideal, it can be replaced (or, since I'm looking for open source, I can adapt it to fit the requirements better) at a later time. Edit: Really what I'm looking for is simplicity. This will be for a small local business, and the orders will consist of 1-10 items that are delivered by a driver who needs a simple list of what each customer received when making the delivery. Looking like a giant online computer/electronics store is definitely not a desirable quality. The simpler the interface presented to customers (who are used to purchasing through dumb web forms and paying COD), the better.

    Read the article

  • What is the correct heading setup for subpages?

    - by user1010609
    Which of the following is best for SEO?
    - Using <h1>keyword</h1> in the layout and putting each subpage title in <h2> tags.
    - Using <h1>keyword</h1> only for the main page, and on each subpage replacing it with <h2>keyword</h2> and using <h1> tags for the subpage title.
    - Not using <h1>keyword</h1> on any of the pages; instead putting the keyword in the header and using it for each subpage, and using <h1>keyword + something for main page title</h1>.
    - None of the above (please go into as much detail as possible).
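
    For illustration only, a rough HTML sketch of the second option above (keyword as the <h1> on the main page; on subpages the title takes the <h1> and the keyword drops to an <h2>). The page names are made up, and this is not a claim about which option ranks best:

        <!-- main page -->
        <h1>keyword</h1>

        <!-- subpage -->
        <h1>Subpage title</h1>
        <h2>keyword</h2>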

    Read the article

  • "Your AdSense account poses a risk of generating invalid activity"

    - by Karington
    I received a mail from the AdSense team. I am not an AdSense expert; I'm actually quite new to it. I spent a lot of time on my site http://www.media1.rs, a news aggregator with tons of options. In the meantime I discovered the DoubleClick service, which had a good option to show Google ads when you don't have any others running, so I signed up for Google AdSense with my company account. Everything went smoothly until one day (21.Jul.2011) I got an email... "Hello, After reviewing our records, we've determined that your AdSense account poses a risk of generating invalid activity. Because we have a responsibility to protect our AdWords advertisers from inflated costs due to invalid activity, we've found it necessary to disable your AdSense account. Your outstanding balance and Google's share of the revenue will both be fully refunded back to the affected advertisers. Please understand that we need to take such steps to maintain the effectiveness of Google's advertising system, particularly the advertiser-publisher relationship. We understand the inconvenience that this may cause you, and we thank you in advance for your understanding and cooperation. If you have any questions or concerns about the actions we've taken, how you can appeal this decision, or invalid activity in general, you can find more information by visiting http://www.google.com/adsense/support/bin/answer.py?answer=57153. Sincerely, The Google AdSense Team" At first I didn't have any idea why... but then it occurred to me that it might have been the auto-refresh script we had, because we publish news very often and it would be useful for visitors; I removed it immediately after I got the mail. Then I thought it might be my friends clicking, thinking that would help me (I didn't tell them to do it and don't know if they did), or something like that. But then it couldn't be that, because otherwise anyone could organize 10 people and get any start-up banned, right? Anyway, I filled out the form on the answers page, mentioning the previously removed script, and got this from them: "Hello, Thank you for your appeal. We appreciate the additional information you've provided, as well as your continued interest in the AdSense program. However, after thoroughly re-reviewing your account data and taking your feedback into consideration, our specialists have confirmed that we're unable to reinstate your AdSense account. As a reminder, if you have any questions or concerns about your account, the actions we've taken, or invalid activity in general, you can find more information by visiting http://www.google.com/adsense/support/bin/answer.py?answer=57153." I do understand that they have to keep things secret in a way, but I don't know what I'm supposed to do now. Is there a checklist I can go through and then re-apply? Where do I re-apply, on the same form? Please help, as we are a small company and can't really budget for hiring a specialist, and I don't know any either... P.S. the current ads on the site are my own, served through DoubleClick. Thanks in advance! Best, Karington

    Read the article

  • How to select categories for a user-generated content site?

    - by Frederik Creemers
    On the site I'm building, users can create tutorials. I want users to be able to create tutorials on as many subjects as possible, but still have some preset categories. What's the best way to select these categories? The reason I don't just let users add keywords and use those for categorization is that users gain experience points in a certain subject when their tutorial is liked by someone, and, in a similar way to the Stack Exchange network, I want to create communities around these subjects. I will give visitors the possibility to suggest new categories. Here are the categories I'm thinking of at the moment: health, gardening, cooking, technology, science & math, music, visual art.

    Read the article

  • Shared to dedicated, or Amazon CloudFront, to improve performance and stay secure?

    - by user978548
    I have a WordPress site whose home page currently takes about 1.8s to 2.5s to load completely in my country. The page weight is about 700 KB (static content included). In order to improve performance, I'm considering two solutions: switching to a dedicated host, or using Amazon S3/CloudFront to serve static content. My current shared host has servers in a neighboring country but not exactly in mine, and both Amazon and the dedicated host have some there, so that's already an advantage. Considering all that, I still have three questions:
    - With currently low traffic (100 unique visitors/day, but growing), will a dedicated server make a huge difference compared to my shared hosting?
    - Knowing that I already use a cookie-less domain to deliver static content (but via a redirection to the same server), would using Amazon S3 make a real difference?
    - Regarding the cons of dedicated vs. Amazon S3: if I choose for the dedicated server something like Ubuntu Server, do daily package updates, and keep only port 80 open, would that be sufficient in terms of security (in comparison with my current shared hosting, which manages everything for me)?
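
    As a rough sketch of the "daily updates, only port 80 open" setup on Ubuntu using ufw; this is an assumption-laden illustration rather than a complete hardening guide, and it adds an SSH rule (not mentioned above) so the box stays reachable for administration:

        # keep packages current (the daily updates mentioned above)
        sudo apt-get update && sudo apt-get upgrade -y

        # default-deny firewall, then open only what is needed
        sudo ufw default deny incoming
        sudo ufw default allow outgoing
        sudo ufw allow 80/tcp    # web traffic
        sudo ufw allow 22/tcp    # SSH for administration (assumption, not in the question)
        sudo ufw enable
        sudo ufw status verbose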

    Read the article

  • Redirect a URL to another URL with IIS 7.5

    - by Jason White
    I have no idea why this isn't working. I've tried creating map rules and then rewriting and redirecting the URL. I've tried just redirecting it with a simple rewrite rule, and no matter what, the only time I can get it to work is if I set the match URL to the regex .*. I'm trying to redirect webmail.example.com to mail.example.com. It seemed like it would take only a couple of seconds; boy, was I wrong. I'm thinking I must be doing something wrong with the regex, but I'm not sure what, as when I test it, it seems to work fine.

        <rule name="webmail" patternSyntax="ECMAScript" stopProcessing="true">
          <match url=".*webmail.*" />
          <conditions logicalGrouping="MatchAll" trackAllCaptures="false">
          </conditions>
          <action type="Redirect" url="https://mail.example.com:8000" appendQueryString="false" logRewrittenUrl="true" />
        </rule>
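
    One likely culprit: in the IIS URL Rewrite module the match pattern is tested against the URL path only, which never contains the host name, so ".*webmail.*" cannot match a request for webmail.example.com. A hedged sketch of matching the host in a condition instead (example.com is a placeholder domain):

        <rule name="webmail redirect" patternSyntax="ECMAScript" stopProcessing="true">
          <match url=".*" />
          <conditions logicalGrouping="MatchAll" trackAllCaptures="false">
            <!-- match the Host header rather than the path -->
            <add input="{HTTP_HOST}" pattern="^webmail\." />
          </conditions>
          <action type="Redirect" url="https://mail.example.com:8000/{R:0}"
                  appendQueryString="false" redirectType="Permanent" logRewrittenUrl="true" />
        </rule>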

    Read the article

  • Changed URL from non-www to www... Google indexing

    - by user20321
    I recently changed (about 1 week ago) my URL from the non-www version to the www version. I told my hosting company to do this and they did it successfully; all my URLs are redirected to the www version. But Google is still indexing the non-www version in the search results. I have added new content to my website, and Google indexes that content with the changed URL, i.e. with the www prefix, but the main page, i.e. the site name, is still shown without www and has not been updated. I have checked that www.sitename.com is listed on Google, but it is not shown when I type www.sitename.com. So how long does it take for the old URLs to be removed from the index and replaced with the new ones?

    Read the article

  • Reading the 'Index Status' graph in Google Webmaster Tools

    - by sam
    I recently found a bunch of old files that had been FTP'ed to a live production server by mistake on a static (HTML/CSS/JS) site. I manually deleted these files, but today when checking in Google Webmaster Tools I found the graph below. The 'update' marker is from 3/9/14. What I can't work out is what Google is trying to tell me. Are they saying that:
    - there was a ranking update like Penguin or Panda, and they penalized my site and de-indexed a load of pages which they thought were junk; OR
    - this is showing that I updated the site by deleting the files on the server on 3/9/14; OR
    - this is something else?

    Read the article

  • Making my own clothes website [on hold]

    - by Manjushree
    I am a BSc student in Mathematics, but I would like to create my own clothes website. Can anyone help me with how to design it? I don't have any background knowledge about putting a webpage online. The clothes website does not have to look professional, just simple enough that I can put up my clothes to show the items to people or customers. Once I have created the clothes website, I can open a business account and start selling the goods online with that account. Do I need to buy a domain to create the website? Please help.

    Read the article

  • Facebook: Sending private messages to FB profile from a static website [migrated]

    - by Frondor
    I need to set up a static website for people to: complete a form, and, using anything from the Facebook API, send the form output via a private message to a Facebook profile. I've been banging my head against the "Facebook developers" pages all night long and can't find out how to do it. It seems quite easy, but the problem is that I don't know if you'll get my point :) Like the Send Dialog feature: you can set a certain user as the recipient, which will be displayed in the "To:" field once the dialog appears.

        FB.ui({
          method: 'send',
          to: 'UserID',
          link: 'http://www.nytimes.com/2011/06/15/arts/people-argue-just-to-win-scholars-assert.html',
        });

    OK, all I need is to be able to use the same behavior, but instead of setting only a "to:" parameter, I'd like to set a "message:" parameter. I don't know how to solve this, because there is actually no such parameter in the API. This is what I need to build (it's a prototype; this code won't work):

        <form action="mysite.com" id="order">
          <input type="radio" name="chocolate" value="white">White <br/>
          <input type="radio" name="chocolate" value="black">Black <br/>
          <input type="submit" value="Order" />
        </form>

    jQuery gets the values:

        $(document).ready(function() {
          $("#order").on("submit", function(e) {
            e.preventDefault();
            var formOutput = $(this).serialize();
            var order = "I'd like to eat " + formOutput + " chocolate";
          });
        });

    The Facebook SDK sends this output (the 'order' string):

        FB.ui({
          method: 'send', // or whatever
          to: 'UserID',
          message: order, // just an example; note the variable coming from the form
          link: 'http://www.nytimes.com/2011/06/15/arts/people-argue-just-to-win-scholars-assert.html',
        });

    As we all know, what I wrote isn't possible, so I'm asking for any alternative solution somebody can give me; I'm not very friendly with Facebook APIs :) I thought of another solution, which consists of using the form output directly in the 'link:' parameter of FB.ui and then reading it with jQuery on some landing page. For example, the linked content in the sent message redirects to this URL: http://mysite.com/dashboard.html?chocolate=white, and the dashboard page source code is:

        <script>
          var choco = getUrlParameter('chocolate');
          $("#dashboard").text("This person wants " + choco + " chocolate");
        </script>
        <div id="dashboard"></div>

    This way, I would be able to see which kind of chocolate the person selected by parsing the parameters in the URL when clicking on the link section of the message, using code like this:

        FB.ui({
          method: 'send', // or whatever
          to: 'MyUserID',
          link: 'http://mysite.com/dashboard.html?chocolate=white',
        });

    But on this try, my biggest problem is that I don't know how to dynamically "customize" that "link:" parameter with jQuery. I think the best solution is to use code like this, along with the dashboard page, in order to "translate" the shared URLs and see what kind of chocolate people are demanding xD

        FB.ui({
          // declaring a variable (example)
          var string = getFormData().serialize;
          var orderString = "mysite.com/dashboard.html?" + string;
          // end the variables
          // start Facebook API code
          method: 'send', // or whatever
          to: 'MyUserID',
          link: orderString,
        });

    I was working here until I gave up and started to post this: http://jsfiddle.net/Frondor/sctepn06/2/ Thanks in advance, I'll love you forever if you help me solve this :D
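
    A minimal sketch of that last idea: build the link value from the serialized form before calling the Send dialog. This assumes the JavaScript SDK (FB.init) and jQuery are already loaded; UserID, mysite.com and dashboard.html are placeholders taken from the question:

        $(document).ready(function () {
          $("#order").on("submit", function (e) {
            e.preventDefault();

            // e.g. "chocolate=white" becomes the query string of the landing page
            var formOutput = $(this).serialize();
            var orderUrl = "http://mysite.com/dashboard.html?" + formOutput;

            // Send dialog: the recipient gets a private message whose link encodes the order
            FB.ui({
              method: 'send',
              to: 'UserID',   // placeholder profile ID
              link: orderUrl
            });
          });
        });

    On dashboard.html the chocolate parameter can then be read back from location.search, much as the getUrlParameter sketch above does.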

    Read the article

  • separate domains vs subdomains [duplicate]

    - by Sharon
    This question already has an answer here: Registering multiple domains vs. subdomains (5 answers). We manufacture a very versatile product that is used in a wide variety of products and marketed under multiple brands. In order to market these brands, should we create a separate domain for each brand/market, or use subdomains of our well-established main domain? What would be best for SEO without breaking the bank?

    Read the article

  • What options are there for integrating with payment gateways?

    - by Rowland Shaw
    It seems that there are only two types of payment gateway service out there at the moment: either the entire cart logic is handled off-site (with something like PayPal's Standard option), or you need to go through the certification for handling credit card numbers and do pretty much everything yourself. Ideally, for the project I'm working on, I'm after a bit of middle ground, such that I can handle the cart on-site and only pass over to a payment gateway (with an order amount, billing & delivery details, and an order reference) for them to handle the card details, before passing back. I'm sure I've used e-commerce sites built on this pattern before, but I cannot find any payment providers out there that offer this sort of option, so are there any? The only other requirement we have at present is that it must accept orders in Sterling.

    Read the article

  • How can I reduce the number of spammers registering with my phpBB site?

    - by Jayapal Chandran
    I have a site which runs phpBB. On this site I have enabled user authentication through email when registering, and enabled CAPTCHA. However, I still get spam users every 20 to 30 minutes. Is there anything I can do to prevent this via the ucp.php file? I have already loaded a large list of IP addresses, yet spam users keep registering all the time. One thing I can do is check the bounce mail to find the username: I can pipe bounced mails to a PHP script and immediately delete that user, but I have not got any bounces back from Hotmail or some other email providers. So this would catch a certain percentage of spam users, but there is still a huge amount of spam registration. What else can I do to prevent spammers abusing my phpBB site?

    Read the article

  • Does Webmaster Tools list traffic from ads as inbound links?

    - by Mohamad
    In Webmaster Tools, under the inbound links section, do ads get counted as inbound links? I am doing a review of inbound links on a website and found that most of them come from meaningless blogs and spam websites. Before I accuse anyone of not doing their job properly, I would like to know something: is it possible that those inbound links were generated when an ad for the website appeared on the spam website? An SEO firm was paid handsomely to generate inbound links, and I am afraid all they did was submit material to spam blogs and websites.

    Read the article

  • Digg alternatives for a blog and unpopular users? [closed]

    - by Wladimir Ivanov
    Hello all. I'm struggling to build an audience for an electronic music blog. The blog is relatively new and has around 30 pages. I get approximately 120 unique visitors daily. I know about directories, RSS, comments and guest blogging, but is there another, more effective strategy for building a quality audience? From what I can see, there isn't much material about this in my country these days. What about Digg and Reddit? Every time I post a link there, no traffic comes to me. Can you suggest other Digg/Reddit/StumbleUpon tactics to get followers, or are there similar sites that would tend to give me some serious traffic? Can you suggest sites appropriate for linking to a music blog? Best regards.

    Read the article

  • Sharing banners on 3rd-party websites; concerned about limited resources on the server side

    - by Omne
    I've made a banner for my website and I'm planning to ask my followers to share it on their websites to help improve my rank. My website is hosted on GAE, the banners are less than 5 KB each, and I must say that I don't want to pay for extra bandwidth. I've read the Google App Engine quotas page, but honestly I don't understand any of it. Would you please tell me which table/data on that page should concern me? Also, do you think it's wise to host such banners, which are going to end up on 3rd-party websites, on GAE? Or am I safer using free online services like Google Picasa?

    Read the article

  • Does Google contribute ranking from cdn.example.com to example.com?

    - by DesignerGuy
    Background: from my understanding, http://mywebsite.com/image.jpg can help the ranking of http://mywebsite.com in a search engine such as Google (obviously the search engine of primary concern). So, SEO-wise, moving an image to http://whatever-cdn.com/my-account/image.jpg is bad. A popular solution is to use a CNAME record, such as http://cdn.mywebsite.com, so that image.jpg can be accessed at http://cdn.mywebsite.com/image.jpg. The question: does http://cdn.mywebsite.com/image.jpg rank as effectively as http://mywebsite.com/image.jpg? Does it help boost the main http://mywebsite.com? Or does it rank independently because it is a subdomain? Is there another option (a way to use a CDN without sacrificing ranking)?
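
    For what it's worth, the CNAME approach itself is just a DNS entry; a hypothetical zone-file line might look like the following, where the exact target host name is whatever the CDN provider assigns (the path part of a URL cannot appear in a CNAME):

        ; map the cookie-less subdomain to the CDN's host name
        cdn.mywebsite.com.   3600   IN   CNAME   whatever-cdn.com.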

    Read the article

  • Dealing with blackhat SEO companies and low quality link building competitors [closed]

    - by Mikko Ohtamaa
    I have often faced cases where the competitors of my client use blackhat SEO tactics: they contract an SEO company to do link building for their websites and products. Here is an example of a typical fake blog created only for link-building purposes: a very low-content article, http://marshallfab.com/fundus-camera-explained.html, in an obvious fake blog; no author information, partially machine-generated text, and all blog posts are solely about link building. Following the link, you get to the promoted company page, http://www.patternless.com/, which, unsurprisingly, links the SEO company homepage in the footer text, http://www.affordableseofl.com/, who are not shy about advertising their "extremely aggressive SEO plan". Does Google have any feedback channel where one could submit cases like this, so that Google would punish the link builders? Are there any means to publicly shame these blackhat companies and damage their reputation?

    Read the article

  • Breadcrumb for multiple categories

    - by Damodar Bashyal
    I post in multiple categories, so is it better to have:
    Consulting Services > Implementation > Service A
    Consulting Services > Optimization > Service A
    Consulting Services > Upgrade > Service A
    or:
    Consulting Services > Implementation, Optimization, Upgrade > Service A
    I was doing it the second way; the problem is that Google doesn't show the third set of crumbs, i.e. it only displays "Consulting Services" in the search result. But having multiple breadcrumb trails on the page doesn't look good. Any suggestions? Update: for @PatomaS's question, I mean 3 lines of breadcrumbs; see above. I have posted the same article (Service A) in 3 categories (Implementation, Optimization, Upgrade), so you can reach the same article through 3 categories. So what's the best breadcrumb to display on the article 'Service A'?

    Read the article

  • SEO: Single URL rewrite from one app to another

    - by user1909186
    I have two web applications running on two different servers. I want one, example.com/hello, to redirect to the second, hello.com, but I want both to contribute to each other's SEO ranking. What is the best way to accomplish this, primarily for Google search and also for other search engines? I currently do a rewrite with the permanent flag from example.com/hello to hello.com using nginx. Thanks for your help.
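
    For reference, a minimal nginx sketch of that permanent (301) rewrite, with the question's host names as placeholders; one common way to write it while preserving anything after /hello:

        server {
            listen 80;
            server_name example.com;

            # 301 redirect /hello and everything under it to the other application
            rewrite ^/hello(.*)$ http://hello.com$1 permanent;
        }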

    Read the article

  • How to retrieve img alt text with jQuery or JavaScript? [on hold]

    - by kate
    What code can I use to retrieve the alternative text of an image? It is a catalogue of clothes (dresses, shirts, skirts, etc.) on the front page of a site. The featured images of the categories can be changed manually by someone. I ran a check and it is asking me to provide alt text. I did that for some images with alt="", but for the catalogue I cannot do it. The code is below:

        {{ 'option_selection.js' | shopify_asset_url | script_tag }}
        {{ 'api.jquery.js' | shopify_asset_url | script_tag }}
        {% if template contains 'customers' %}
          {{ 'shopify_common.js' | shopify_asset_url | script_tag }}
          {{ 'customer_area.js' | shopify_asset_url | script_tag }}
        {% endif %}
        {% if settings.display_slideshow %}{{ 'jquery.slider.js' | asset_url | script_tag }}{% endif %}
        {% if settings.include_masonry %}{{ 'jquery.masonry.js' | asset_url | script_tag }}{% endif %}
        {% if settings.enable_product_image_zoom %}{{ 'jquery.zoom.js' | asset_url | script_tag }}{% endif %}
        {{ 'fancy.js' | asset_url | script_tag }}
        {{ 'shop.js' | asset_url | script_tag }}

        Shopify.money_format = '{{ shop.money_format }}';

        {% if template contains "product" %}
        jQuery(document).ready(function($){
          {% if product.variants.size > 1 or product.options.size > 1 %}
          new Shopify.OptionSelectors("product-select", {
            product: {{ product | json }},
            onVariantSelected: selectCallback
          });
          {% assign found_one_in_stock = false %}
          {% for variant in product.variants %}
            {% if variant.available and found_one_in_stock == false %}
              {% assign found_one_in_stock = true %}
              {% for option in product.options %}
                $('#product-select-option-' + {{ forloop.index0 }}).val({{ variant.options[forloop.index0] | json }}).trigger('change');
              {% endfor %}
            {% endif %}
          {% endfor %}
          {% endif %}
        });

        $(function() {
          $( "#tabs" ).tabs();
        });
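
    To the question in the title, a minimal jQuery sketch for reading (and, if empty, setting) an image's alt text; the .catalogue selector is a hypothetical placeholder, since the catalogue markup itself is not shown above:

        $(document).ready(function () {
          // read the alt text of the first catalogue image
          var altText = $('.catalogue img').first().attr('alt');
          console.log(altText);

          // give every catalogue image missing alt text a fallback value
          $('.catalogue img').each(function () {
            if (!$(this).attr('alt')) {
              $(this).attr('alt', $(this).closest('a').attr('title') || '');
            }
          });
        });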

    Read the article

  • Does Google sometimes prevent new white-hat sites from ranking at all in some verticals?

    - by JVerstry
    Assume someone wants to launch a new viagra or acai berry e-commerce website. There is a lot of competition, and this site does not really bring anything new, other than a new online storefront to buy products at a nice price. Assume this site does not use any blackhat techniques and stays within Google's quality guidelines, and assume it has no (or few) backlinks, and those only from non-authoritative websites. Assume this website's pages are indexed properly in Webmaster Tools and that no penalties are reported, no site improvements are suggested, Google crawls the site daily as reported in GWT, and there are no robots.txt configuration issues. Does Google sometimes decide not to rank such a site for any user query (for weeks) because of a lack of original content? The reason I am asking is that I am trying to understand the possible cause of a similar situation I am observing with two sites. If so, what is the way out, to start ranking for these sites? If not, does it mean the cause is certainly elsewhere? Any confirmed information to get out of the maze is welcome.

    Read the article

  • Understanding the maximum hit-rate supported by a web-server

    - by SNag
    I would like to crawl a publicly available site (one that is legal to crawl) for a personal project. From a brief trial of the crawler, I gathered that my program hits the server with a new HTTP request 8 times a second. At this rate, by my estimate, obtaining the full set of data I need would take about 60 full days of crawling. While the site is legal to crawl, I understand it can still be unethical to crawl at a rate that causes inconvenience to the regular traffic on the site. What I'd like to understand is: how high is 8 hits per second for the server I'm crawling? Could I possibly do 4 times that (by running 4 instances of my crawler in parallel) to bring the total effort down to just 15 days instead of 60? How do you find the maximum hit rate a web server supports? What would be the theoretical (and ethical) upper limit for the crawl rate so as not to adversely affect the server's routine traffic?
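
    There is no universal number; the usual advice is to respect robots.txt (including any Crawl-delay) and stay well below what ordinary visitors generate. As a sketch of throttling rather than parallelizing, here is a hedged Node.js example (Node 18+ for the built-in fetch); the one-second delay and URLs are placeholders, not a recommendation for any particular site:

        // minimal throttled fetch loop
        const urls = [
          'https://example.com/page/1',
          'https://example.com/page/2',
        ];

        const DELAY_MS = 1000; // at most ~1 request per second (placeholder value)
        const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

        (async () => {
          for (const url of urls) {
            const res = await fetch(url, {
              headers: { 'User-Agent': 'personal-research-crawler (contact: you@example.com)' },
            });
            console.log(url, res.status);
            await sleep(DELAY_MS); // wait before the next hit, instead of firing 8/sec
          }
        })();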

    Read the article
