Search Results

Search found 9724 results on 389 pages for 'pro zeck'.

Page 237/389

  • Best way to import a 500 MB CSV file into a MySQL database?

    - by mars
    I have a 500 MB CSV file that needs to be imported into my MySQL database. I've made a PHP page where I can upload the CSV file; it analyses the fields and does the actual importing, but it can only handle small files of 5 MB at most. That would mean about 100 separate files, and the uploading itself is pretty slow. Is there another way? I have to repeat this process every month because the data in the file changes every month; it's about 12,000,000 lines. (A minimal bulk-load sketch follows below.)

    Read the article
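
    A minimal sketch of the bulk-load alternative referenced above: skip the PHP upload/parsing step and let MySQL ingest the file directly with LOAD DATA LOCAL INFILE. The table name, file path and credentials are placeholders; it assumes the CSV has already been transferred to the server (by SFTP, for example) and that local_infile is enabled.

        <?php
        // Minimal sketch: bulk-load a large CSV straight into MySQL instead of
        // parsing it row by row in PHP. Table, file path and credentials are
        // placeholders for illustration.
        $db = mysqli_init();
        mysqli_options($db, MYSQLI_OPT_LOCAL_INFILE, true);
        mysqli_real_connect($db, 'localhost', 'dbuser', 'dbpass', 'mydb');

        $sql = "LOAD DATA LOCAL INFILE '/path/to/monthly_data.csv'
                INTO TABLE monthly_data
                FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"'
                LINES TERMINATED BY '\\n'
                IGNORE 1 LINES";              // IGNORE 1 LINES skips the header row

        if (!mysqli_query($db, $sql)) {
            die('Import failed: ' . mysqli_error($db));
        }
        echo mysqli_affected_rows($db) . " rows imported\n";

    Because the monthly file replaces the old data, truncating the table (or loading into a staging table and swapping) before the LOAD is the usual pattern; MySQL handles 12,000,000 rows this way far faster than row-by-row INSERTs from PHP.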

  • XML Parsing Error at 1:1544. Error 4: not well-formed (invalid token)

    - by Steve
    I have installed Joomla 1.5.22 on a new hosting account which doesn't have a domain yet, so its public URL is http://cp-013.micron21.com/~annimac/ A message appears saying: XML Parsing Error at 1:1544. Error 4: not well-formed (invalid token). The source code for this message is:

        <dl id="system-message">
          <dt class="error">Error</dt>
          <dd class="error message fade">
            <ul>
              <li>XML Parsing Error at 1:1544. Error 4: not well-formed (invalid token)</li>
            </ul>
          </dd>
        </dl>

    There is nothing in /logs to indicate what the problem is. I have uploaded the following folders from a freshly unzipped copy of Joomla 1.5.22: administrator, components, includes, language\en-GB, libraries, modules, plugins, templates\ja_purity, xmlrpc, and the issue remains. I have no custom or additional plugins, modules, or components installed. If I change templates, the problem remains. What is the problem?

    Read the article

  • Is there a way to set up a ClickTale tag in Google Tag Manager?

    - by Cubius
    Since GTM doesn't support the document.write() method, the standard ClickTale code doesn't work. Is there a workaround for this? A ClickTale employee sent me these instructions: replace the document.write JS line above with the following: document.body.appendChild(externalScript); Example:

        <!-- ClickTale Bottom part -->
        <script type='text/javascript'>
        var externalScript = document.createElement('script');
        var scrSrc = document.location.protocol=='https:'?
            'https://clicktalecdn.sslcs.cdngc.net/':
            'http://cdn.clicktale.net/';
        scrSrc += 'www11/ptc/xxx-xxx-xxx-xxx.js';
        externalScript.src = scrSrc;
        externalScript.type = 'text/javascript';
        document.body.appendChild(externalScript);
        </script>
        <!-- ClickTale end of Bottom part -->

    I am not sure what to do with this. Has someone tried something like this?

    Read the article

  • Somehow Google considers a properly 301'd URL as 200 and is still indexing the new content under the old page?

    - by user2178914
    We redirected all the old URLs to new ones properly using htaccess. The problem is that Google is somehow still finding content at the old page (which it shouldn't) and stores it in the cache under the old URL rather than the new one. For example:
    Old page: http://www.natures-energies.com/iching.htm
    New page: http://www.natures-energies.com/index.php?option=com_content&view=article&id=760
    If you type the old URL into the browser, it redirects. If you fetch the old URL as Googlebot in Webmaster Tools, the header says 301/permanently redirected. If I try to crawl as any other bot it still says 301 redirected. Even if you click the old link in Google it redirects to the new URL. Only in its cache does it show the old URL, and moreover it shows the new content in it! I am stumped as to how Google manages to grab the new content and put it under the old URL instead of the new one! One more interesting thing: if I request a cache of the new page, it shows a cache of the new content with the old URL! Any help would be appreciated; I am at the end of my wits. I think I have tried almost everything. Is there anything that I'm missing? You can use this search to find the old URLs; maybe you'll see some patterns that I missed: site:www.natures-energies.com inurl:htm -inurl:https|index (A header-checking sketch follows below.)

    Read the article
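
    Not an answer to the caching question itself, but a quick way to double-check what the old URL actually returns, independent of Fetch as Googlebot. A minimal sketch using PHP's get_headers(); the URL is the one from the question.

        <?php
        // Minimal sketch: look at the raw response for the old URL to confirm it
        // really answers with "301 Moved Permanently" and the expected Location header.
        $old = 'http://www.natures-energies.com/iching.htm';

        // follow_location => 0 keeps PHP from following the redirect, so we see the
        // very first response (the same one Googlebot gets).
        $ctx = stream_context_create(['http' => ['follow_location' => 0]]);
        $headers = get_headers($old, 1, $ctx);

        echo $headers[0], "\n";       // status line, e.g. "HTTP/1.1 301 Moved Permanently"
        echo 'Location: ', isset($headers['Location']) ? $headers['Location'] : '(none)', "\n";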

  • Storing data on server [closed]

    - by Maciekp
    1. How am I supposed to store data on a server using something other than databases, text files and images?
    2. How could someone implement the kind of storage behind Facebook's Graph API (http://developers.facebook.com/docs/reference/api/)? When I go to https://graph.facebook.com/19292868552 it shows me such data (how can it be stored? I guess it's not a MySQL database). (A small read-side sketch follows below.)
    PS. Link to article: http://jayant7k.blogspot.com/2009/05/how-facebook-stores-billions-of-photos.html <- How can concurrent write requests be handled while storing data in a text file?

    Read the article
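
    On the Graph API part of the question: whatever Facebook uses as its storage engine, the URL above simply returns JSON over HTTP, so any backend (MySQL, a key-value store, flat files) could sit behind an endpoint like that. A minimal sketch of reading it, assuming allow_url_fopen is enabled:

        <?php
        // Minimal sketch: the Graph API URL from the question returns plain JSON;
        // the storage engine behind it is invisible to the client.
        $json = file_get_contents('https://graph.facebook.com/19292868552');
        $data = json_decode($json, true);    // decode into an associative array

        // Print whatever scalar top-level fields the endpoint exposes.
        foreach ($data as $field => $value) {
            if (is_scalar($value)) {
                echo $field, ': ', $value, "\n";
            }
        }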

  • A record not resolving

    - by user1561108
    I have a hosted domain at SiteGround. On this domain and host I have a subdomain with a WordPress install. I wish to move this blog to another host (HostGator) while keeping the domain with SiteGround. To do this I created a hosting account at HostGator, got its IP address and set the A record in SiteGround's cPanel accordingly: subdomain.example.com 14400 A (IP of the HostGator account). Going by an online traceroute tool the records appear to have been updated (over 4 hours ago now), as the name now resolves to a theplanet.com server location, which HostGator uses, yet the subdomain is still not resolving from a web browser. The account at HostGator has been set up and is navigable via ip-address/~accountname. What's going wrong here? I should add that the relevant DNS records on the HostGator side look like this:

        subdomain.example.com 86400 IN SOA ns483.websitewelcome.com.
        subdomain.example.com 86400 IN NS ns1.siteground145.com.
        subdomain.example.com 86400 IN NS ns2.siteground145.com.
        subdomain.example.com 14400 IN A 74.54.176.3

    I'm not sure whether the HostGator record should be classed as the SOA record, but I don't know enough about it to be sure. Is this the source of the problem? (A quick lookup sketch follows below.)

    Read the article
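
    A quick way to see which records public resolvers are currently returning for the subdomain, independent of any one online traceroute tool. A minimal sketch using PHP's dns_get_record(); the hostname is the placeholder used in the question.

        <?php
        // Minimal sketch: query the A and NS records currently visible for the
        // subdomain, to check whether the new HostGator IP has propagated.
        $host = 'subdomain.example.com';    // placeholder hostname from the question

        foreach (dns_get_record($host, DNS_A) as $rec) {
            echo 'A  ', $rec['ip'], '  (ttl ', $rec['ttl'], ")\n";
        }
        foreach (dns_get_record($host, DNS_NS) as $rec) {
            echo 'NS ', $rec['target'], "\n";
        }

    If the NS records still point at the SiteGround name servers, the A-record change has to live in SiteGround's zone (as described above), and resolvers may simply be holding the old answer until the 14400-second TTL expires.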

  • Is it possible to get a free web host for my registered domain? [closed]

    - by Ahmed Alsayadi
    Possible Duplicate: How to find web hosting that meets my requirements? I searched online for many free web hosting websites like NetFirms; most of them ask you to register for their sub-domain or to buy a new domain, but I already have one, which I bought through GoDaddy. Now I hope to find a free web host for my website (the site's size is less than 20 MB). Any idea which web hosting service can meet such requirements?

    Read the article

  • Customer escalated to a claim without sending the item back? [closed]

    - by kavoir.com
    She claims that she has sent the item back, but for over a month I haven't received it. I don't know where I can find the tracking number, so I don't know whether she really sent it or not. Now she has escalated the dispute to a claim and PayPal is asking me for information. The reason is "Not as described". So how do I respond to this? In the dispute we agreed that I would issue a full refund as soon as I received the package she returned to us, but we never received the package she claims to have sent back. Now she's escalating this to a claim and PayPal is asking me for documents. How can I provide any documents that would prove she hasn't sent the package? Thanks!

    Read the article

  • Why are new pages not being indexed and old pages stay in the index?

    - by ZakGottlieb
    I currently have a site that was recently restructured, causing much of its content to be reposted and creating new URLs for each page. To avoid duplicates, all of the existing pages were added to the robots file. That said, it has now been over a week - I know Google has recrawled the site - and when I search for term X, it is still the old page that is ranking, with the new one nowhere to be seen. I'm assuming it's a cached version, but why are so many of the old pages still appearing in the index? Furthermore, all "tags" pages (it's a Q&A site, like this one) were also added to the robots file a few months ago, yet I think they are all still appearing in the index. Does anyone have any ideas about why this is happening, and how I can get my new pages indexed?

    Read the article

  • Difference between Global and Local SEO

    - by user29660
    I have been reading up on SEO techniques in an effort to learn how to do it thoroughly so I can charge my client for the service. To gauge my price I have checked out competitor prices and noticed that there's a fair price difference when it comes to guaranteeing a page 1 ranking with global keywords compared to local keywords. So what is the difference in terms of workload and techniques used that justifies this price difference? Just to clarify, I am looking for technical differences in programming, methodology, etc.

    Read the article

  • Defining pages by custom variable, reporting to Google Analytics

    - by zwolfson
    Does anyone know how I could add some information to my page (whether in a meta tag or some other way) to indicate to Google Analytics that a page is targeted to a certain segment? This is pretty straightforward on my main site, where everything is set up in a hierarchy by target user, so I can just use the URL to filter the data. However, my subdomains have a flat structure that's just a bunch of landing and conversion pages, so I can't use the URL. Any help would be much appreciated. Thanks

    Read the article

  • IP blocked because: (smtpauth) Failed SMTP AUTH login from - can someone explain?

    - by gdaniel
    Today I had a few users blocked in our server firewall because of: (smtpauth) Failed SMTP AUTH login from. Can someone explain the reason? What does it mean exactly? Could someone be using our website to access SMTP for spamming purposes? UPDATE: Server info: CentOS with cPanel and WHM, although no one else has access to either. Looking at the logs, it appears someone repeatedly attempted to log in with a known, existing username and password.

    Read the article

  • AJAX spreadsheet editor interfaced to own website

    - by Ole Tange
    My website has records that are tables. I would like my users to be able to edit these records in an easy way. Currently they download a .csv file, edit it in their favourite spreadsheet, and upload it again, but this often fails (they upload in the wrong format or edit fields that they are not supposed to touch). I would therefore like to present the users with an editor directly on the website - just as you can have WYSIWYG editors in CMSes for text, I would like to have one for spreadsheets. One solution would be to interface my website with Google Docs and have the users edit the files using the Google Docs spreadsheet, and somehow get the sheet back when they are done, but I do not know if this is possible at all.

    Read the article

  • How to know whether an article is copied from another site? [on hold]

    - by cj333
    We have a site; our main service is for blogs and free-form writing. Our customers can submit their articles on our site, but to protect copyright we do not allow customers to simply copy and paste a whole article from another site; at the least they should write something themselves. How can we check whether an article is a complete copy from another site? Google Webmaster Tools? The Google Search API? Or is there a better way? We keep receiving DMCA notices via Google, so I am trying to stop such articles before they are posted. Thanks. (A hedged exact-phrase check sketch follows below.)

    Read the article
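
    A hedged sketch of the exact-phrase check mentioned above, using Google's Custom Search JSON API (it needs an API key and a search-engine ID; both values below are placeholders). The idea is to take a distinctive sentence from the submitted article, quote it, and see whether it already appears on another site.

        <?php
        // Hedged sketch: check whether an exact sentence from a submitted article
        // already exists elsewhere, via the Google Custom Search JSON API.
        // YOUR_API_KEY and YOUR_CX are placeholders to be created in the Google console.
        function phrase_found_elsewhere($sentence, $ownDomain)
        {
            $url = 'https://www.googleapis.com/customsearch/v1?' . http_build_query([
                'key' => 'YOUR_API_KEY',
                'cx'  => 'YOUR_CX',
                'q'   => '"' . $sentence . '"',    // quoted = exact-phrase search
            ]);

            $result = json_decode(file_get_contents($url), true);
            $items  = isset($result['items']) ? $result['items'] : [];

            foreach ($items as $item) {
                // Ignore hits on our own domain.
                if (strpos($item['link'], $ownDomain) === false) {
                    return $item['link'];          // first external page containing the phrase
                }
            }
            return false;
        }

        // Usage: pick one or two distinctive sentences from the submitted article.
        $hit = phrase_found_elsewhere('a distinctive sentence from the article', 'example.com');
        echo $hit ? "Possible copy: $hit\n" : "No exact match found\n";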

  • Challenge a .name registration?

    - by Shtééf
    The Wikipedia page on .name says the following: Registration restrictions: No prior restriction on registration, but registrations can be challenged if not by or on behalf of individual with name similar to that of domain, or fictional character in which registrant has rights But there's no further info on how this actually works. Can a .name domain registration be challenged, and if so, how?

    Read the article

  • Looking for open-source solutions for a knowledge sharing website

    - by Bundarr
    There is a need in my workplace to set up a knowledge sharing site, a place where users can discuss the projects they are working on, share documentation, and ask questions. I am looking for an open-source system that answers these needs, can be set up in a week, and requires only PHP and MySQL. I am a WordPress fan and developer and could easily implement such a system in WordPress; however, this system needs to be very simple to use for the technically challenged. Without customization, WordPress users would still need to log in to the "back-end" to post. I like the Stack Exchange (OSQA) format, but these do not allow for file uploading out of the box. I do not have experience with BuddyPress; would this be an alternative?

    Read the article

  • Alternatives to Google Sites for a personal web site?

    - by Oleg2718281828
    I'd like to find a hosting solution for my personal, fairly low-traffic web site. I think Google Sites would suit me perfectly, but I've heard horror stories about Google's algorithms shutting down all your accounts on a whim, with no right of appeal, so I'm naturally worried, because I don't want to lose my Gmail. Edit: I'm talking about cases like this one: I tried contacting somebody at Google support ("Surely they should have a support department, right?" Nope, wrong!). (The victim managed to regain access to his Gmail account when his case went "viral", but he never got an explanation as to why he had been locked out in the first place.) Are there good, preferably free, alternatives to Google Sites, and what are their PROs and CONs? One requirement is that I should be able to point the DNS (foobar.com) to it.

    Read the article

  • Google Scholar Related Question

    - by Art
    I have just submitted a request for Google Scholar to collect papers from my personal web site: http://cs.uic.edu/~asmirnov/publications.html I was wondering whether I did everything right: I submitted the request on the form provided on the Scholar web site, and I published the papers in PDF on my web site. Is there anything else needed for Google to index my site? Other questions:
    1. The first paper link points not just to the paper but to the whole issue.
    2. Are there any tags to be added to my web site? If so, which ones, and how do I add them? (A hedged meta-tag sketch follows below.)
    3. What are the exporting options available on the Google Scholar web site, and how do they work?
    Thank you very much for being patient with me and my questions.

    Read the article
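
    On question 2: Google Scholar's inclusion guidelines rely on bibliographic meta tags in each paper's landing page, the Highwire Press style citation_* tags being the most widely supported. A hedged sketch of emitting them from PHP; the title, authors, year and PDF path below are placeholders, not taken from the actual papers.

        <?php
        // Hedged sketch: print the bibliographic meta tags Google Scholar looks for
        // into the <head> of a paper's landing page. All values are placeholders.
        $paper = [
            'title'   => 'Example Paper Title',
            'authors' => ['A. Smirnov', 'B. Coauthor'],
            'year'    => '2012',
            'pdf'     => 'http://cs.uic.edu/~asmirnov/papers/example.pdf',
        ];

        echo '<meta name="citation_title" content="' . htmlspecialchars($paper['title']) . '">' . "\n";
        foreach ($paper['authors'] as $author) {
            echo '<meta name="citation_author" content="' . htmlspecialchars($author) . '">' . "\n";
        }
        echo '<meta name="citation_publication_date" content="' . $paper['year'] . '">' . "\n";
        echo '<meta name="citation_pdf_url" content="' . htmlspecialchars($paper['pdf']) . '">' . "\n";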

  • How would I migrate a FrontPage form-based site to an IIS7 server? [on hold]

    - by Nick Gilpin
    I have a large HTML website on a very old server and I'm attempting to move it to a brand new server running IIS7. I've tried moving the entire site to the WWW directory, but then all my forms return "The HTTP verb POST used to access path is not allowed". Additionally, all the solutions I've found online only work on a page-by-page basis. Is there a simpler or faster way to migrate a large website containing many forms to IIS7? Here is the specific error message: The HTTP verb POST used to access path '/FormServer/Mig/_vti_bin/shtml.dll/admissions/askseaaggie.htm' is not allowed.

    Read the article

  • How to offer a cookie opt in/out to users?

    - by Darkcat Studios
    I intend to use Google Analytics, and as I understand it I will need to offer users the option to opt out of cookies. The question is this: I HATE these constant cookie option boxes, and everyone I ask is getting annoyed by them too. It's nice to have the option, but we all know cookies have been in use for well over a decade. So how big a deal do I have to make of the fact that I'm using Google Analytics? Can I put a small link at the bottom of the page, maybe integrate it into the "Privacy policies" page, and give people the option to opt out there? This would very much be the "assume the majority of users don't mind, but at least make the option available" stance. Ironically, setting a cookie seems to be the only way I can see to enforce the opt-out, since IPs change. (A minimal sketch of that approach follows below.)

    Read the article
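
    A minimal sketch of the "assume consent, but honour an opt-out" approach described above: the privacy-policy page sets an opt-out cookie, and the analytics snippet is only printed when that cookie is absent. The cookie and file names are made up for illustration.

        <?php
        // Minimal sketch: only emit the Google Analytics snippet when the visitor
        // has not opted out. The "ga_optout" cookie name is made up for illustration.

        // An "opt out" link on the privacy-policy page could point to a script that does:
        //     setcookie('ga_optout', '1', time() + 60 * 60 * 24 * 365, '/');

        // Then, in the page template, wrap the analytics include:
        if (empty($_COOKIE['ga_optout'])) {
            include 'analytics-snippet.php';    // file containing the usual GA JavaScript
        }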

  • Blank lines between source code lines [closed]

    - by manix
    I'm confused by some strange behaviour. I have been editing some PHP files remotely with PhpDesigner8 (a PHP editor). Everything looks right on my side, but when my teammates reopen the files that I have edited, the source code has blank lines inserted, like this:

        class AdminController extends Controller {

            function __construct() {

                parent::__construct();

                if (!$this->session->can_admin()) {

                    show_error('Solo para administradores.');

                }

                $this->load->library('backend');

            }

        }

    instead of:

        class AdminController extends Controller {
            function __construct() {
                parent::__construct();
                if (!$this->session->can_admin()) {
                    show_error('Solo para administradores.');
                }
                $this->load->library('backend');
            }
        }

    Have you experienced these kinds of problems?

    Read the article

  • Reliable method for Google Analytics tracking for a print advertising campaign?

    - by chrisjlee
    A client is looking to track advertising clicks through a newspaper ad to measure success. They have a rigid business requirement that it be a unique domain, e.g. foowidgetsnews.net instead of foodwidgets.com/contact-form-page.php. What is the most reliable method of building a redirect to a landing page so that the visit is tracked in Google Analytics as coming from the newspaper? Finally, we would like to track foowidgetsnews.net as the main URL in Google Analytics, because a 301 redirect isn't tracked in Google Analytics the way we would like it to be. (A minimal redirect sketch follows below.)

    Read the article
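
    A minimal sketch of one common workaround: the vanity domain from the ad serves a one-line script that redirects to the real landing page with campaign (utm_*) parameters attached, so Google Analytics attributes the visit to the print campaign rather than recording a bare redirect. The campaign values are placeholders.

        <?php
        // Minimal sketch: the vanity domain redirects to the real landing page with
        // campaign parameters so Analytics can attribute the visit. Values are placeholders.
        $target = 'http://foodwidgets.com/contact-form-page.php?' . http_build_query([
            'utm_source'   => 'newspaper',
            'utm_medium'   => 'print',
            'utm_campaign' => 'spring-ad',
        ]);

        header('Location: ' . $target, true, 301);
        exit;

    This does not make foowidgetsnews.net show up as the "main URL" in Analytics, but the utm_ parameters at least let the landing-page visits be segmented by the print campaign.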

  • How to edit thousands of HTML pages at once? [on hold]

    - by Johnsy Omniscient
    I need to edit thousands of pages for a website whose content was added manually by the owner over 3 years. It has thousands of pages, and I'm sure there is a better way to edit them than spending hours opening each one. I know it would be easy to just edit styles.css, but page elements like the positions of the Google ad boxes are edited individually inside the HTML of each page, so there is no way to solve this through CSS. Is there some sort of code, script or macro that can edit all the pages at once? (A minimal sketch follows below.)

    Read the article
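
    A minimal sketch of the script-based approach, assuming the pages live under one directory and the ad-box markup is consistent enough to match with a plain string (or regex) replacement. The directory and search/replace strings are placeholders, and running it on a copy of the site first is the obvious precaution.

        <?php
        // Minimal sketch: walk a directory tree and apply the same replacement to
        // every .html file. Directory and search/replace strings are placeholders.
        $dir     = '/path/to/site';
        $search  = '<div class="ad-box-old">';
        $replace = '<div class="ad-box-new">';

        $it = new RecursiveIteratorIterator(new RecursiveDirectoryIterator($dir));
        $changed = 0;

        foreach ($it as $file) {
            if ($file->isFile() && strtolower($file->getExtension()) === 'html') {
                $html = file_get_contents($file->getPathname());
                $new  = str_replace($search, $replace, $html, $count);
                if ($count > 0) {
                    file_put_contents($file->getPathname(), $new);
                    $changed++;
                }
            }
        }
        echo "$changed files updated\n";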

  • Cloud proxying service

    - by ChristopherJ
    I have an app that mashes up images from Bing image search; it's hosted on Heroku and written in Rails. The app is client-side JavaScript, so the mashup is done on an HTML5 canvas. This means, though, that if I fetch the images directly from the Bing server, the canvas gets dirty and I can't save it. As a quick workaround, I have set up a route on my Rails app that simply proxies the request to Bing and passes the result back through. Obviously this is a poorly performing solution and will eat up my dynos very quickly. Can anyone suggest a more suitable option? At the moment I'm thinking maybe Amazon EC2 with Apache mod_rewrite rules would perform better and be more cost effective. Is there a cloud service (or an app I could deploy to a cloud service) that would be more appropriate for proxying requests for me, so that my JavaScript can fetch the images without dirtying the canvas?

    Read the article

  • How do I create a background image on a web page?

    - by kasha
    I am a new designer, so hopefully this question isn't too basic! How do I create a background image on a webpage for a programmer? I designed the page in Photoshop and I would like to know how to send the background image (the 25% opacity buildings overlay). I would be happy to send the main image, but it is too large and I imagine it would slow the site's loading time drastically. Here is the link to the design: http://problemio.com/home_page_1_1.pdf

    Read the article
