Search Results

Search found 60978 results on 2440 pages for 'web development'.

Page 330/2440

  • Logic behind crawling webpages the way Screaming Frog does? [on hold]

    - by sree
    I would like to know which parameters should be considered when developing a crawler like Screaming Frog. I'm looking for information on the do's and don'ts of webpage crawling: what problems the crawler may cause for the pages it visits (increased load time, maybe, or anything else that affects a page while it is being crawled), what rules the crawler needs to follow, and so on -- basically, anything that makes the crawler well-behaved and accurate. Just point me in the right direction. I hope my requirement is clear this time. :)
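    To make the question concrete, here is the sort of minimal "polite" fetch loop I have in mind (a sketch in Python, standard library only; the user-agent string and delay are placeholder assumptions):

        # a minimal sketch of a polite fetch loop, not a full crawler
        import time
        import urllib.robotparser
        from urllib.request import Request, urlopen

        USER_AGENT = "MyCrawler/0.1 (+http://example.com/bot)"  # always identify the bot

        def polite_fetch(url, robots, delay=1.0):
            """Fetch url only if robots.txt allows it, then pause before the next request."""
            if not robots.can_fetch(USER_AGENT, url):
                return None  # respect Disallow rules
            html = urlopen(Request(url, headers={"User-Agent": USER_AGENT})).read()
            time.sleep(delay)  # rate-limiting keeps the crawl from hurting load time
            return html

        robots = urllib.robotparser.RobotFileParser("http://example.com/robots.txt")
        robots.read()
        page = polite_fetch("http://example.com/", robots)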

  • Disqus thread migration. Gotchas?

    - by sramsay
    I've been migrating a site to a new domain. The site itself is pretty straightforward (it uses Jekyll), and everything has gone fine -- except the migration of Disqus threads. I've had partial success: some of the threads have migrated, but not all. I've tried the domain migration wizard (which caught a few), the URL mapper (which caught a few), and the 301 redirect crawler (which caught a few). But the remaining threads just won't move, no matter which method I use. So I suppose I'm asking whether there are any "gotchas" I should know about here. When you run any of these migration tools, it says it will "take awhile." Does that mean hours? Days? I can't tell whether it's working, and there's no logging or error reporting that I can see.
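    For reference, the URL mapper works from a plain CSV upload -- as I understand it, one thread per line, the old URL followed by a comma and the new URL (the paths below are placeholders):

        http://olddomain.com/2012/05/some-post/, http://newdomain.com/2012/05/some-post/
        http://olddomain.com/2012/06/another-post/, http://newdomain.com/2012/06/another-post/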

  • Would it be possible to build a client portal on Squarespace 6?

    - by aBathologist
    I'm helping a family member set up a site which will need to include a secure client portal, providing access to documents and a simple database. I have been encouraging them to go with a more established, open-source CMS like Drupal or Joomla, whose capability in this area is evident. However, they have a strong preference for Squarespace. Does anyone know whether this could be accomplished with the new developer platform for Squarespace 6? I've spent well over an hour searching Google, the Squarespace site and Stack Exchange, but can't seem to find any clear answer to this question. I'm grateful for any insight you can provide.

  • Include in service layer all the application's functions or only the reusable ones?

    - by BornToCode
    Background: I need to build a main application with some operations (CRUD and more) in WinForms, and another application (in WebForms) that will re-use some of the functions of the main application. I understand that a service layer is the best approach here, and that the service should call the functions in the BL layer (correct me if I'm wrong). The dilemma: in my main WinForms UI, should I call the functions from the BL or from the service, and why? And should I create a service for every single function in the BL, even if I need some of those functions in only one UI? For example, should I create services for all the CRUD operations, even though I only need to re-use the update operation in the WebForms app? Your help is much appreciated.
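    To make the dilemma concrete, here is a rough sketch (Python, purely for illustration; every name is invented) of one common arrangement: a thin service facade wraps only the BL operations that need to be shared, and both UIs call the facade so the BL is exposed in exactly one place.

        class CustomerBL:
            """Business-logic layer: the real rules and persistence live here."""
            def update(self, customer_id, fields):
                print("updating %s with %r" % (customer_id, fields))

        class CustomerService:
            """Service layer: a thin facade that either UI (WinForms or WebForms) calls."""
            def __init__(self, bl=None):
                self._bl = bl or CustomerBL()

            def update_customer(self, customer_id, fields):
                return self._bl.update(customer_id, fields)

        service = CustomerService()
        service.update_customer(42, {"name": "Acme"})  # same call from either UI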

  • What is the best way to auto-failover to a backup WAN link for a web server

    - by user66735
    Hi. I am looking for the best way to ensure my server (application) remains available to all my users (on the web/LAN/WAN) when my primary ISP link fails. My server is behind a firewall on which both my primary and secondary links terminate. I have already assigned multiple IPs (both ISPs' static IPs) to the 'A' record (host.example.com) in the DNS. However, in a round-robin scenario, is there a way I can ensure that my web users will never see a "cannot display web page" error? What are the better methods of achieving this?
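    For context, what plain round-robin DNS amounts to is the sketch below (BIND zone-file syntax; the addresses are documentation placeholders). Round robin alone does not detect a failed link, so a short TTL only limits how long clients keep resolving to a dead address after you withdraw its record; a guaranteed "never see an error" setup usually needs health-checked failover DNS or a multi-WAN firewall that keeps a single public IP reachable.

        ; two A records with a low TTL so a dead address can be withdrawn quickly
        host.example.com.  60  IN  A  203.0.113.15   ; primary ISP
        host.example.com.  60  IN  A  198.51.100.16  ; secondary ISP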

  • Hosting a magnet link site which could possibly infringe on copyrighted material?

    - by Griff
    For the last 3 months I have built a crawler, an indexer and a lot of other things for what started out as a home project for indexing magnet links on the internet. As the project grew, I have thought about releasing my collected data (which at the minute is on a public domain but with no access) to the public. Whatever the crawler sucks in goes in, and whatever the indexer decides to index gets indexed, as it is a fully automated process. My question is as follows: considering that most of the data collected by what I have built points to illegal copyrighted material (as most magnet links do), where would it be best to host such a site? I notice that all of the existing public torrent sites are hosted in India -- is this because their laws on copyright infringement are less strict? Have any of you hosted such a site, and if so, what problems have you run into? And, as always, any advice on being a webmaster for this type of website?

  • Chromium web-app creation doesn't work in 12.10?

    - by speter
    If you create an application shortcut for a website in Chromium and choose "desktop", a .desktop file is created. In 12.04 you could move it to ~/.local/share/applications/ and then start it from the Dash or the launcher. In my fresh 12.10 installation this doesn't work anymore. I think the line that is misinterpreted is Exec=/usr/bin/chromium-browser --app=http://buymeapie.com/. If you try to run that line from a terminal, you get the following result: 20:32 ~ speter Exec=/usr/bin/chromium-browser --app=http://buymeapie.com/ bash: --app=http://buymeapie.com/: Datei oder Verzeichnis nicht gefunden (English: "file or directory not found"). Can anyone explain this or suggest a workaround?
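    For comparison, a minimal hand-written launcher looks like the sketch below (the Name and Icon values are guesses). One thing worth checking on 12.10 is whether the file is marked executable (chmod +x foo.desktop); newer releases treat non-executable launcher files as untrusted, which may be why the Dash ignores it -- can anyone confirm?

        # a minimal sketch of a web-app .desktop file
        [Desktop Entry]
        Version=1.0
        Type=Application
        Name=Buy Me a Pie
        Exec=/usr/bin/chromium-browser --app=http://buymeapie.com/
        Icon=chromium-browser
        Terminal=false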

  • Finding out the shared hosting providers located in a particular data center

    - by unixman83
    I know the physical locations of the data centers I want my website hosted in. One of them is at 350 E Cermak in Chicago, IL. My problem is that I am looking for all the providers of low-cost shared hosting in this particular data center. Do you have a list? And if you do, can you please tell me how you came up with it? I know many discount hosting providers are physically located in the Arizona-Utah area, but I am located near Chicago.

  • WHM local/external mail server confusion

    - by BWRic
    We host several websites on the same server using WHM, but this seems to confuse mail routing when a client has their own external mail servers -- the server delivers locally. We have our own email accounts hosted on the server. When creating an account for a client on the same server, WHM adds the default entries to the DNS for that account. However, this client has their own mail servers elsewhere, and when we send them an email it never reaches that external server; it just goes to the local, incorrect one. I realise I can update my DNS to point to the external server, but that means copying their settings, and if they change them I will also need to update mine. Is there a setting I can use to force the server to use the external mail servers without having to copy the settings?
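    From what I have read, this is cPanel's per-domain mail routing rather than DNS: Exim delivers mail for any domain listed in /etc/localdomains on the box itself, instead of looking up the external MX. A sketch of what to check (the domain is a placeholder; the same switch is exposed in WHM/cPanel as Email Routing -> "Remote Mail Exchanger"):

        # if the client's domain appears here, its mail is delivered locally
        grep -x 'clientdomain.com' /etc/localdomains
        # remote domains are delivered via their public MX records instead
        echo 'clientdomain.com' >> /etc/remotedomains   # and remove it from /etc/localdomains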

  • Webmaster Tools is throwing out 404 errors for a link that is not on the page

    - by plantify
    Webmaster Tools is showing thousands of 404 errors where pages on the site supposedly refer to another, incorrect URL. For example: URL not found www.plantify.co.uk/shop/=, linked from http://www.plantify.co.uk/shop/gift-voucher and http://www.plantify.co.uk/shop/special-plant-offers. I have obviously checked the source and cannot find any reference to this link on any page. The only consistent pattern is that the error is only reported on pages with two path segments, i.e. www.plantify.co.uk/shop does not report any error, whilst all pages of the form www.plantify.co.uk/shop/xxx (where xxx can be several different pages, such as gift-voucher) report it. I cannot seem to duplicate this error. I have run a link checker (we use Screaming Frog) and it does not report this error. I have fetched these pages as a bot, and they do not show this error either. I am at a total loss: I cannot even duplicate the issue, but it is most definitely an issue, as Webmaster Tools is reporting new errors every day. Is this perhaps Googlebot doing its own thing?

  • Emulate Historical Figures, e.g. Einstein - Is this possible using linguistic logic for my http://www.ustimeline.com Education System

    - by Johnnylight
    After hearing about the success of IBM's Watson, I started thinking: perhaps emulating human language is now possible? My goal is to create virtual historical characters to represent the main figures in my Adventur-Cation The Great American Adventure program, such as Einstein or Crazy Horse. The aim is to build an intelligent system capable of indexing the internet and storing the data in a schema informed by modern linguistic theory (phonemes, morphemes, syntax), so that it can return a semantically sound response very similar to the response the same person would give if they were still alive today. The same engine/system would be used for all characters; each character would have their own digital representation and voice, and would organize data differently based on the tags/keywords stored about that individual. Imagine a Max Headroom Einstein. Based on the success of Watson, I believe something like this may now be possible. It would be an interesting way to study history, and a vehicle for entertainment as well. Can anyone confirm whether this has already been attempted? Is anyone interested in exploring it using cognitive science, psychology, artificial intelligence, historical data captured on the internet, and linguistic theory?

  • IIS isn't propagating domain

    - by ErocM
    I called GoDaddy and 'verified' that my settings for the two IPs were correct: ns1.asezo.com = xx.xx.xx.15 and ns2.asezo.com = xx.xx.xx.16. I then set the nameservers of asezo.com to the ns1/ns2 above, which GoDaddy tech support says is right. When I try to visit my site, Chrome says: Oops! Google Chrome could not find asezo.com. When I try to ping the website, it times out. The bindings in IIS for the default website are: http - xx.xx.xx.15 - 80 - www.asezo.com and http - xx.xx.xx.15 - 80 - asezo.com. And I'm still getting nothing. I can go directly to the IP (http://xx.xx.xx.15/) and get the IIS default website; I just can't get the domain name to resolve. What am I doing wrong?
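    Here is one way to separate the DNS question from IIS (a sketch; substitute the real address for xx.xx.xx.15):

        nslookup asezo.com xx.xx.xx.15   # query the machine the nameservers point at, directly
        nslookup asezo.com 8.8.8.8       # query a public resolver to test propagation

    If the first query times out, nothing is answering DNS on that IP -- a common gotcha when custom nameservers (ns1/ns2) are pointed at a web server that is not actually running a DNS service.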

  • Change A-Record in Hetzner Konsoleh

    - by Charles
    I am a Hetzner level 19 shared-hosting client and would like to change the A record for my domain. When I log in to konsoleH, I can see which IP address it currently points to, but not where I can change it. Does anyone know how to do it? Thanks! Solution: I called Hetzner support. The problem was that I had logged in with my domain rather than my client ID. Once logged in with the client ID, I could just click on the A record and edit it.

  • Redirect Google crawler to different robots.txt via .htaccess

    - by user3474818
    I have googled all day and still couldn't find an answer. I have a virtual subdomain, www.static.example.com, which is a mirror of www.example.com: there is just one root folder for the subdomain and the domain alike. I want to redirect crawlers to a different robots.txt file - robots_static.txt - whenever they see .static in the URL, and in that file forbid indexing with a Disallow rule. I want to do this because I have duplicated content in Google's search results: the subdomain shows exactly the same content as the main domain. Does anyone know how I can make crawlers see robots_static.txt instead of robots.txt? What I have managed to find so far is this:

        RewriteCond %{HTTP_HOST} ^www.static.*$ [NC]
        RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /.*robots\.txt.*\ HTTP/ [NC]
        RewriteRule ^robots\.txt /robots_static.txt [NC,L]

    but when I check in Webmaster Tools, it still sees robots.txt as my robots file instead of robots_static.txt, so it crawls and indexes everything twice. What did I do wrong? Thanks. EDIT: This is my .htaccess file:

        ##
        # @package    Joomla
        # @copyright  Copyright (C) 2005 - 2013 Open Source Matters. All rights reserved.
        # @license    GNU General Public License version 2 or later; see LICENSE.txt
        ##

        ##
        # READ THIS COMPLETELY IF YOU CHOOSE TO USE THIS FILE!
        #
        # The line just below this section: 'Options +FollowSymLinks' may cause problems
        # with some server configurations. It is required for use of mod_rewrite, but may
        # already be set by your server administrator in a way that disallows changing it
        # in your .htaccess file. If using it causes your server to error out, comment it
        # out (add # to the beginning of the line), reload your site in your browser and
        # test your SEF URLs. If they work, it has been set by your server administrator
        # and you do not need it set here.
        ##

        ## Can be commented out if it causes errors, see notes above.
        Options +FollowSymLinks

        ## Mod_rewrite in use.
        RewriteEngine On

        RewriteCond %{HTTP_HOST} !^www\.
        RewriteRule ^(.*)$ http://www.%{HTTP_HOST}/$1 [R=301,L]

        RewriteCond %{HTTP_HOST} ^www.static.*$ [NC]
        RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /.*robots\.txt.*\ HTTP/ [NC]
        RewriteRule ^robots\.txt /robots_static.txt [NC,L]

        ## Begin - Rewrite rules to block out some common exploits.
        # If you experience problems on your site block out the operations listed below
        # This attempts to block the most common type of exploit `attempts` to Joomla!
        #
        # Block out any script trying to base64_encode data within the URL.
        RewriteCond %{QUERY_STRING} base64_encode[^(]*\([^)]*\) [OR]
        # Block out any script that includes a <script> tag in URL.
        RewriteCond %{QUERY_STRING} (<|%3C)([^s]*s)+cript.*(>|%3E) [NC,OR]
        # Block out any script trying to set a PHP GLOBALS variable via URL.
        RewriteCond %{QUERY_STRING} GLOBALS(=|\[|\%[0-9A-Z]{0,2}) [OR]
        # Block out any script trying to modify a _REQUEST variable via URL.
        RewriteCond %{QUERY_STRING} _REQUEST(=|\[|\%[0-9A-Z]{0,2})
        # Return 403 Forbidden header and show the content of the root homepage
        RewriteRule .* index.php [F]
        #
        ## End - Rewrite rules to block out some common exploits.

        ## Begin - Custom redirects
        #
        # If you need to redirect some pages, or set a canonical non-www to
        # www redirect (or vice versa), place that code here. Ensure those
        # redirects use the correct RewriteRule syntax and the [R=301,L] flags.
        #
        ## End - Custom redirects

        ##
        # Uncomment following line if your webserver's URL
        # is not directly related to physical file paths.
        # Update Your Joomla! Directory (just / for root).
        ##
        # RewriteBase /

        RewriteCond %{THE_REQUEST} ^GET.*index\.php [NC]
        RewriteCond %{THE_REQUEST} !/system/.*
        RewriteRule (.*?)index\.php/*(.*) /$1$2 [R=301,L]
        RewriteCond %{THE_REQUEST} ^GET

        ## Begin - Joomla! core SEF Section.
        #
        RewriteRule .* - [E=HTTP_AUTHORIZATION:%{HTTP:Authorization}]
        #
        # If the requested path and file is not /index.php and the request
        # has not already been internally rewritten to the index.php script
        RewriteCond %{REQUEST_URI} !^/index\.php
        # and the request is for something within the component folder,
        # or for the site root, or for an extensionless URL, or the
        # requested URL ends with one of the listed extensions
        RewriteCond %{REQUEST_URI} /component/|(/[^.]*|\.(php|html?|feed|pdf|vcf|raw))$ [NC]
        # and the requested path and file doesn't directly match a physical file
        RewriteCond %{REQUEST_FILENAME} !-f
        # and the requested path and file doesn't directly match a physical folder
        RewriteCond %{REQUEST_FILENAME} !-d
        # internally rewrite the request to the index.php script
        RewriteRule .* index.php [L]
        #
        ## End - Joomla! core SEF Section.

        <FilesMatch "\.(ico|pdf|flv|jpg|ttf|jpg|jpeg|png|gif|js|css|swf)$">
        Header set Expires "Wed, 15 Apr 2020 20:00:00 GMT"
        Header set Cache-Control "public"
        </FilesMatch>

        <ifModule mod_headers.c>
        Header set Connection keep-alive
        </ifModule>

        ########## Begin - Remove Etags
        #
        FileETag none
        #
        ########## End - Remove Etags
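    A quick way to see which file each hostname is actually serving (a sketch using the hosts from the question):

        # compare what the static subdomain and the main domain return for robots.txt
        curl -s http://www.static.example.com/robots.txt | head
        curl -s http://www.example.com/robots.txt | head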

  • Could jQuery and similar tools be built into the browser install?

    - by Ozz
    After reading another question about jQuery and CDNs: would it be feasible for tools like jQuery to "come with" the browser, reducing or eliminating the need for the first download from a CDN or from your own host server? jQuery files are pretty small, so you could easily ship a number (all?) of the different versions as part of a browser install. Fair enough, this would increase the install footprint and the download time for the browser itself. Sites could then check "local" first, before the CDN (which then caches), before finally defaulting to downloading from the website's own server. If this is feasible, has it been done? And if not, why hasn't it been done?

  • Domain name at different host to website

    - by Corbula
    I'm making a website for someone whose domain name and current website are hosted at fasthosts. I've built them a new site at a different host, unlimitedwebhosting, in a directory like this: www.mysite.com/dev/0002. So fasthosts is the registrar for the domain name and also hosts all of the email addresses and their current site; unlimitedwebhosting has the new site in a subdirectory (.../dev/0002). Is it possible to keep the domain name and email addresses hosted at fasthosts, have the new website hosted in my unlimitedwebhosting account, and somehow point the domain at the new website?
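    If the DNS zone at fasthosts can be edited directly, the usual split looks like the sketch below (zone-file syntax; the address and the mail hostname are placeholders). Note that an A record points at a server, not at a directory, so the new host would also need mysite.com added (e.g. as an addon/hosted domain) and mapped to the /dev/0002 folder.

        mysite.com.      3600 IN A   203.0.113.10                ; unlimitedwebhosting web server
        www.mysite.com.  3600 IN A   203.0.113.10
        mysite.com.      3600 IN MX  10 mail.fasthosts.example.   ; email stays at fasthosts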

  • How to Evaluate the Best Web Builder Through a Website Building Review

    There is a lot of software specifically designed for creating websites, which makes it hard to choose which website-building software to use. Some website builders are perfect for e-commerce websites, while others suit small business owners or those who make websites as a hobby. To help you find the right website builder, you can check out some of the website-building reviews available online.

  • Domain mapping issues

    - by Nadya
    I have two domain names -- .com and .co.uk -- bought with 123-reg, and just one student Windows hosting pack, associated with the .co.uk domain. The .com domain is the main one people will be trying to access, so I mapped that domain to the hosting this morning. The problem is that I would really like it to be functional by tomorrow morning, and the usual waiting time is 24-48 hours. Is there any point in stopping the process and forwarding it with a CNAME record instead -- does that take less time? (I can go back and do proper domain mapping over the weekend.) Also, is there a way to check whether the domain mapping has been done correctly before the 24-48 hours are up? From some computers I get a 404 error on the homepage.
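    One quick check is to ask 123-reg's authoritative nameserver directly, bypassing resolver caches (a sketch; the nameserver hostname is an assumption -- use whichever servers the control panel lists for the domain):

        dig example.com A @ns.123-reg.co.uk +short   # answer straight from the authoritative server
        dig example.com A +short                     # answer via the local resolver, for comparison

    If the first query already returns the new address, the mapping itself is correct and the remaining wait is just caching.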

  • How much effort is involved in moving a WordPress site to a private server? [on hold]

    - by Alan
    I work in tech, but on the business side. I have a WordPress site that I would like to move to a personal server and associate with a new domain name. I already have a server (actually, a friend is letting me use his) and the domain name. A friend-of-a-friend, who claims to be an IT pro, has agreed to help, but is now asking for what feels like a lot of money for what he says is a pretty time-intensive job. This doesn't sound right to me, so I thought I would ask here: would it take months or even days to move the content, and why would it have to be moved in stages? The blog currently uses a basic template and has about 1000 posts. How much effort is really involved in moving a WordPress site from one server to another? Can anyone explain the process? Would it just make more sense to point the domain name at the existing WordPress blog and pay the nominal yearly fee? I appreciate any answers you can provide.
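    For a sense of scale, the usual manual move amounts to a handful of steps rather than months of work (a rough sketch, assuming SSH and MySQL access on both ends; database names, users and paths are placeholders):

        mysqldump -u wpuser -p wp_db > wp_backup.sql             # export the database
        rsync -az /var/www/oldsite/ new-server:/var/www/site/    # copy the WordPress files
        # on the new server: create the database, import wp_backup.sql, adjust
        # wp-config.php, then update the siteurl/home values for the new domain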

  • Lining things up while using columns

    - by Charles
    I have a request that may not be possible. I'd like to line up the elements of a form so that the inputs all start at the same place:

        Name:                            [        ]
        Company:                         [        ]
        Some question with a long name:  [        ]

    But my list is (somewhat) long, and I would like to show it in multiple columns on screens that are wide enough. Ideally I'd find a POSH method (table-free is semantically appropriate, I think) that works in a reasonable number of browsers. My current page uses a table. I tried CSS with:

        columns: auto;
        -moz-column-count: auto;
        -moz-column-width: auto;
        -webkit-column-count: auto;
        -webkit-column-width: auto;

    but Firefox (at least) won't break a table across columns.
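    One table-free arrangement is sketched below (the class name and label width are invented), assuming each label/input pair is wrapped in its own element such as a p: fixed-width inline-block labels produce the alignment, and the form itself becomes the multi-column container, so no table has to break.

        form.cols       { column-count: 2; -moz-column-count: 2; -webkit-column-count: 2; }
        form.cols p     { -webkit-column-break-inside: avoid; page-break-inside: avoid; }
        form.cols label { display: inline-block; width: 14em; }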

  • hPalm and a web-centric strategy

    The acquisition battle has come and gone, and it's HP that has become Palm's new owner. In general this news has been greeted with glad cries, despite (or maybe because) it was so unexpected. Nearly everybody assumes that the marriage of Palm software and HP hardware will be a good one, and that HP will also release a webOS-based tablet device. However, there's an interesting dissenting opinion on VisionMobile (a blog I highly recommend, by the way). Guest author Michael Valukenko sees few synergies...

  • Making a Fun Website Web 2.0 and Beyond

    If you think that making a website that is informative and easy to navigate is going to bring targeted visitors to your website, guess again. You have to make it fun and captivating because you want your visitors to come back and see you again.

  • The SQL Server Setup Portal

    - by BuckWoody
    One of the tasks that takes a long time for the data professional is setting up SQL Server. No, it isn’t that difficult to slide a DVD in a drive and click “Setup” but the overall process of planning the hardware and software environment, making decisions for high-availability, security and dozens of other choices can make the process more difficult. And then, of course, there are the inevitable issues that arise. Microsoft supports literally hundreds and even thousands of combinations of hardware and software drivers from vendors you’ve never even heard of. Making all of that work together is a small miracle, so things are bound to arise that you need to deal with. So, to help you out, we’ve designed a new “SQL Server Setup Portal”. It’s a one-stop-shop for everything you need to know about planning and setting up SQL Server. As time goes on you’ll see even more content added. There are already whitepapers, videos, and multiple places to search on everything from topic names to error codes. So go check it out – and if you have to do a lot of SQL Server Setups – and especially if you don’t – bookmark it as a favorite!

  • Virtual Private Server Web Hosting Services

    There are many reasons why an organization requires a virtual private server. A Virtual Private Server hosting plan can be the solution to all of your needs. The best way to decide would be to access... [Author: John Anthony - Computers and Internet - May 11, 2010]
