Search Results

Search found 6362 results on 255 pages for 'django urls'.

Page 179/255

  • Open screen and run some projects and applications

    - by trex
    I am a Python web developer. Every day I need to run my 3-4 local Django projects in screen sessions, and also launch some applications: Skype, Chrome, Eclipse and a text file, daily status.txt. Is there a way to launch all of them by running a single shell script? This is what I have so far:

        #!/bin/bash
        # gnome-terminal -e "screen -dmS myapps"
        # (attach the following command to one of the screens)
        cd /var/opt/project1
        python manage.py runserver 127.0.0.1:8001
        # (attach another command to one of the screens)
        cd /var/opt/project2
        python manage.py runserver 127.0.0.1:8002
        # (attach another command to one of the screens)
        cd /var/opt/project3
        python manage.py runserver 127.0.0.1:8003
        # start my applications
        eclipse
        skype
        gedit "/home/myname/Desktop/daily status.txt"
        [...]

    Can someone help me write a shell script to do this?
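
    A minimal Python sketch of such a launcher, using the project paths and ports from the question. It assumes GNU screen and the desktop applications are on the PATH; the screen session names are made up for illustration:

        #!/usr/bin/env python
        # launch_dev.py -- start a detached screen session per Django project,
        # then open the desktop applications.
        import subprocess

        PROJECTS = {
            "project1": ("/var/opt/project1", "127.0.0.1:8001"),
            "project2": ("/var/opt/project2", "127.0.0.1:8002"),
            "project3": ("/var/opt/project3", "127.0.0.1:8003"),
        }

        for name, (path, addr) in PROJECTS.items():
            # screen -dmS <name> starts a detached session running the given command
            subprocess.Popen([
                "screen", "-dmS", name,
                "bash", "-c", "cd %s && exec python manage.py runserver %s" % (path, addr),
            ])

        # desktop applications
        for cmd in (["eclipse"], ["skype"],
                    ["gedit", "/home/myname/Desktop/daily status.txt"]):
            subprocess.Popen(cmd)

    Reattach to any of the runserver sessions later with "screen -r project1" (and so on).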

    Read the article

  • How to get rid of crawling errors due to the URL Encoded Slashes (%2F) problem in Apache

    - by user14198
    The Google web crawler has indexed a whole set of URLs with encoded slashes (%2F) for our site. I assume it picked up the pages from our XML sitemap file. The problem is that the live pages actually fail because of the URL-encoded slashes problem in Apache. Some solutions are mentioned here. We are implementing a 301 redirect scheme for all the error pages, which should make the Google bot delete the pages from the crawling errors (no more crashing pages). Does implementing the 301s require the pages to be "live"? In that case we may be forced to implement solution 1 from the article, but solution 1 would pose a security vulnerability.

    Read the article

  • How relevant is PHP today for browser games?

    - by Bitgarden
    I was the lead developer of two moderately successful browser games quite a few years back, and plan on working on a new game soon. At the time, I wrote them in pure PHP (no template engine or anything of the sort). I'd like to start working on the new game, but I have been out of the web development world for a while. Reading around, I hear a lot of good things about Rails, Django, Node.js, etc., with which I have no experience (although I know my way around Python, JavaScript, and the others quite well). So my question is the following: if I were to go back to my old ways and use PHP again, would I be making things hard for myself? Would picking something more "trendy" have a real impact on my development? In addition, does anyone have any pointers specifically about developing browser games with these more modern tools?

    Read the article

  • What to choose for beginner: PHP/Python/Ruby

    - by Nai
    I'm a beginner teaching myself to code, but I would like the insight of the PSE community in helping me choose where to start. My main objective is to be able to create a basic website to first test my business idea and from there iterate on it quickly, to minimise my learning time. The most important criterion for me is speed. An example of speed would be pre-built components available as open source, so I don't have to write them from scratch. From my research, this seems to be a death match between the following languages and frameworks:

        PHP and CakePHP
        Python and Django
        Ruby and Rails

    Assumptions:

        I am going to be equally good (or bad) in all 3.
        It is going to be equally easy to find competent developers in either language. I know this to be false already, but let's assume that it is.

    This question is not meant to karma-whore; I've seen how passionate some of these standoff questions have become, and I'll be happy to turn it into a community wiki.

    Read the article

  • best/simplest way to inform search engine of sitemap location

    - by Don
    AFAIK, there are two ways to make search engines aware of a sitemap's location:

        1. Include an absolute link to it in robots.txt.
        2. Submit it to them directly. The relevant URLs are:
           http://www.google.com/webmasters/tools/ping?sitemap=SITEMAP_URL
           http://www.bing.com/webmaster/ping.aspx?sitemap=SITEMAP_URL
           where SITEMAP_URL is the absolute URL of the sitemap.

    Currently, I do both. Regarding (2), I have a job that runs automatically every day and submits the sitemap to Bing and Google. I don't think there's any reason to do both (1) and (2), but I'm paranoid, so I do. I imagine you could avoid both (1) and (2) if you just made your sitemap accessible at a conventional URL (like robots.txt). What's the simplest and most reliable way to ensure that search engines can find your sitemap?
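
    A minimal Python 3 sketch of the daily submission job described in (2), using the two ping endpoints quoted above; the sitemap URL is a placeholder:

        # ping_sitemaps.py -- notify Google and Bing of a sitemap's location.
        import urllib.parse
        import urllib.request

        SITEMAP_URL = "http://www.example.com/sitemap.xml"  # placeholder

        PING_ENDPOINTS = (
            "http://www.google.com/webmasters/tools/ping?sitemap={}",
            "http://www.bing.com/webmaster/ping.aspx?sitemap={}",
        )

        for endpoint in PING_ENDPOINTS:
            ping_url = endpoint.format(urllib.parse.quote(SITEMAP_URL, safe=""))
            with urllib.request.urlopen(ping_url) as response:
                # print the HTTP status returned by each service
                print(ping_url, response.status)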

    Read the article

  • SEO Google Publisher Network

    - by Andy
    I'm just about to start a new business that creates niche affiliate sites. I'm curious about how Google will treat me if all the URLs are hosted with the same Analytics tags, Webmaster Tools tags and server IP ranges. To benefit the most from Google's SERPs, should I have each domain in a separate Analytics account and Webmaster Tools account, or is it OK to have all of my domains within one account? My issue is duplicate content and the fact that I am building a publisher network, and I'm not sure how much Google likes them. I'm notoriously bad at searching and as such haven't found what I'm looking for yet. Any help would be very much appreciated.

    Read the article

  • How do I interpret direct traffic that lands on random pages?

    - by mfg
    Looking at yesterday's data, according to Google Analytics I got six direct visitors to my site (their source/medium is direct/(none)). Only one ended up at the actual domain; the other five ended up at miscellaneous foo.com/xyz.html pages. I did not send out links to people by email, and I'm not sure how likely it is that people would have copy/pasted the URLs. How do these visitors end up there? Is there a way to better capture where they might be coming from?

    Read the article

  • Is jargon related to a framework? (concept)

    - by MaKo
    If this is not the right place to ask this question, please tell me where it belongs and I will move it. I'm unsure of the correct English word or concept [English is not my native language] for the relationship of a language to a framework. For example, I work with Objective-C with the Cocoa Touch framework, or Python with the Django framework. My comparison is with natural languages versus formal languages: if the language were a natural language such as English, would the framework be a kind of [computer/IT] jargon? Does this make sense? If not, what other concept would describe the relationship between a natural language and a framework?

    Read the article

  • A lot of 302 redirects

    - by user3651934
    I have a website whose one-month stats show:

        Unique Visitors   6274
        Total Visitors    7260
        Pages visited     9520
        Hits             88891

    What concerns me is the HTTP status code: 302 Moved temporarily (redirect), 36302 hits. How come 40% of hits are being redirected? If that is not normal, what could the possible reasons be?

    Adding more information: here is the code I'm using in my .htaccess file for clean URLs. Is this causing as many as 36302 redirect hits?

        RewriteCond %{REQUEST_FILENAME}.php -f
        RewriteRule ^([^\.]+)$ $1.php [L]
        RewriteCond %{REQUEST_FILENAME} -d
        RewriteRule ^(.+[^/])$ $1/ [R]
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule ^(.*)$ index.php?page=$1 [L,QSA]
        RewriteRule ^(.*)/$ index.php?page=$1 [L,QSA]
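
    (With mod_rewrite, an [R] flag with no explicit status code issues a 302.) To see exactly which requests are being answered with a 302, it can help to tally them straight from the access log. A minimal Python sketch, assuming a common/combined-format Apache log; the log path is hypothetical:

        # count_302s.py -- list the request paths most often answered with a 302.
        import collections
        import re

        LOG = "/var/log/apache2/access.log"  # adjust to your setup
        line_re = re.compile(r'"(?:GET|POST|HEAD) (?P<path>\S+) \S+" (?P<status>\d{3})')

        counts = collections.Counter()
        with open(LOG) as fh:
            for line in fh:
                m = line_re.search(line)
                if m and m.group("status") == "302":
                    counts[m.group("path")] += 1

        for path, n in counts.most_common(10):
            print(n, path)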

    Read the article

  • IIS 6 nested virtual directory redirection

    - by threedaysatsea
    We're running IIS 6 on a WinServer2k3 box and we're having some trouble with the following problem. E-mails were sent out to users asking them to go to this URL:

        alias.contoso.com/directory2/view.aspx?queryparam1=no&queryparam2=blue

    However, the URLs are actually supposed to be:

        server.contoso.com/directory2/view.aspx?queryparam1=no&queryparam2=blue

    It's too late to recall all of the e-mails, and we'd like to redirect traffic to make this as seamless as possible for our users. The real problem here is that the server (server.contoso.com) hosts the alias (alias.contoso.com) as a redirect, and we need to keep that existing redirect functional:

        Default Web Site (server.contoso.com)
        --Directory1
        --Directory2
        --Directory3

        Redirection to Directory3 (alias.contoso.com)
        --Essentially, alias.contoso.com takes the user to server.contoso.com/Directory3

    Is there any way to host a separate redirect inside of the existing redirect? We need to keep alias.contoso.com taking the user to server.contoso.com/Directory3, but also make alias.contoso.com/directory2/view.aspx?queryparam1=no&queryparam2=blue point to server.contoso.com/directory2/view.aspx?queryparam1=no&queryparam2=blue. Any tips? Is this even possible?

    Read the article

  • Content Optimization only?

    - by danie7L T
    There are tons of discussions around tips and tricks to improve search engine "ranking" and SEO. What if the webmaster/client's focus is 100% on the quality of the content, with precise keywords in meta tags, clean design, regular article updates, clean URLs, and highly filtered external links leading to pages on websites dealing with the same or related subjects? Isn't it the job of a good search engine like Google to catch such a website and show it on its front page? Or do search engines count on us to help them find us, so that webmasters will always have to stay up to date on SEO tools and rule updates, on top of website design, browser customization, progressive enhancement, etc.?

    Read the article

  • Extracting meta tags attribute using wget [migrated]

    - by Amit
    I have a file with one URL per line. I need to extract the "keywords" present in the meta tags, i.e. if there is a meta tag for "keywords" then I want to get the "content" value for it. Example: if the web page has this meta tag

        <meta name="keywords" content="wikipedia,encyclopedia">

    then for that URL I want "wikipedia,encyclopedia" to be extracted. One approach is to download the web page using "wget" and then parse it with some standard HTML parser. I was wondering: is there any better way to do this without downloading the entire web page?
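
    One hedged sketch of that idea in Python 3: read the response in small chunks and stop as soon as </head> has been seen, so the body of the page is never fetched. The chunk size, and the assumption that the keywords tag lives inside <head>, are illustrative choices:

        # extract_keywords.py -- pull the keywords meta tag without reading the whole page.
        import urllib.request
        from html.parser import HTMLParser

        class KeywordParser(HTMLParser):
            def __init__(self):
                super().__init__()
                self.keywords = None
                self.done = False

            def handle_starttag(self, tag, attrs):
                attrs = dict(attrs)
                if tag == "meta" and attrs.get("name", "").lower() == "keywords":
                    self.keywords = attrs.get("content")

            def handle_endtag(self, tag):
                if tag == "head":
                    self.done = True  # no need to read any further

        def keywords_for(url, chunk_size=4096):
            parser = KeywordParser()
            with urllib.request.urlopen(url) as response:
                while not parser.done:
                    chunk = response.read(chunk_size)
                    if not chunk:
                        break
                    parser.feed(chunk.decode("utf-8", errors="replace"))
            return parser.keywords

        if __name__ == "__main__":
            print(keywords_for("http://www.example.com/"))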

    Read the article

  • How will this affect my SEO ranking?

    - by dunc
    I run a fishkeeping website based on the WordPress (PHP) CMS. I've recently put a fairly complex "filter" into place which searches my content for mentions of fish species profiles and turns them into active links. For example,

        asdasd this is a test about abdomen to see if the caudal fin will work asdadasdas try again with abdomen and A. panduro and Apistogramma panduro

    becomes

        asdasd this is a test about abdomen to see if the caudal fin will work asdadasdas try again with abdomen and <a href="/?p=1703" class="link_species">A. panduro</a> and <a href="/?p=1703" class="link_species">Apistogramma panduro</a>

    On the rest of my website, the species are linked with pretty URLs such as /species/apistogramma-panduro/, but due to the way this filter works, the only information I have access to is the id of the post. As such, I'm using /?p=1703 or whatever the ID is. What I'd like to know is: how much will this affect my SEO rating/ranking? Will it be detrimental if I don't rewrite the function? Thanks in advance,

    Read the article

  • Moved sitemaps to a different subdomain and losing search referrals around the same time. Red herring or correlation?

    - by er1234
    We started to lose search referral traffic around the same time that I moved some of our sitemaps to a subdomain. Could this have hurt us? I followed Google's steps for creating a sitemap under a different subdomain. The new sitemaps.foo.com subdomain is being crawled and indexed well. Both www.foo.com and sitemaps.foo.com have been verified in Google Webmaster Tools, and they appear as distinct sites. Is this correct? I can't find a way in Webmaster Tools to say "Hey, sitemaps.foo.com is really owned by www.foo.com, so show them together and make sure to attribute sitemaps.foo URLs to www.foo". Our www.foo.com/robots.txt contains:

        Sitemap: http://www.foo.com/sitemap.xml
        Sitemap: http://sitemaps.foo.com/subdir/sitemap.xml.gz

    Read the article

  • How to generate "language-safe" UUIDs?

    - by HappyDeveloper
    I always wanted to use randomly generated strings for my resources' IDs, so I could have shorter URLs like this: /user/4jz0k1. But I never did, because I was worried about the random string generation creating actual words, e.g. /user/f*cker. This brings two problems: it might be confusing or even offensive for users, and it could mess with the SEO too. Then I thought all I had to do was set up a fixed pattern, like adding a number every 2 letters. I was very happy with my 'generate_safe_uuid' method, but then I realized it was only better for SEO and worse for users, because it increased the ratio of actual words being generated, e.g. /user/g4yd1ck5. Now I'm thinking I could create a 'replace_numbers_with_letters' method, and check that it hasn't formed any words against a dictionary or something. Any other ideas? P.S. As I write this, I also realized that checking for words in more than one language (e.g. English and French, Spanish, etc.) would be a mess, and I'm starting to love numbers-only IDs again.
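
    One common way to sidestep the dictionary check entirely is to generate IDs from an alphabet that cannot spell ordinary words in the first place. A minimal Python sketch; the alphabet and length are illustrative choices, not a recommendation:

        import secrets

        # No vowels (so plain words cannot form) and no easily confused
        # characters such as 0/O or 1/l.
        ALPHABET = "bcdfghjkmnpqrstvwxz23456789"

        def short_id(length=6):
            """Return a short random ID, e.g. 'k7trwm'."""
            return "".join(secrets.choice(ALPHABET) for _ in range(length))

        print(short_id())

    Leetspeak-style spellings (like the g4yd1ck5 example) can still slip through in principle, so a small blacklist check can be layered on top if that matters.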

    Read the article

  • Does using a PHP framework count as experience using PHP to a company that doesn't use that framework?

    - by sq1020
    I've started working at a company that uses the Yii PHP framework. I'm mostly using Yii, but also some frontend stuff like jQuery and Ajax. What I'm worried about is limiting my skill set to a framework that isn't very popular. I mean, if the company I worked for were using Ruby on Rails or even Django, I wouldn't have this concern for the future. My first question, then, with regard to being able to find a job somewhere else in the future: is my concern warranted? Secondly, I see a lot of PHP jobs out there, but do you think experience using a PHP framework counts as valuable experience to a company that doesn't use that particular framework, or any framework at all?

    Read the article

  • Which language is productive for high phase business application development? [closed]

    - by Nizar
    My friends and I would like to build web-based products and sell them using a licensing approach (renewed every year, for example). Which server-side language would be most suitable for our purpose? We could target the following audience:

        - Personal sites.
        - Serious small-to-medium companies (selling products such as help desks, forms, etc.).
        - Restaurants (selling online ordering web applications).

    We would like to:

        - attract as many customers as possible;
        - provide updates for our products (for our customers);
        - make our products easy to use.

    There are a number of open-source frameworks and languages that have the potential to handle our business problems (like Django, Python, Java, etc.). However, we are not sure which one is easier to learn and has a variety of tools/plugins to help us in the development process. So we need your experience on this hard-to-decide matter: which language, and which supporting framework, should we choose?

    Read the article

  • Most Useful New Technology?

    - by Craig Ferguson
    I'm looking to take a sort of sabbatical, and I'd love to use it to learn a new technology. My question is this: what's the most useful "new" technology for a software engineer to use? Node.js, iOS programming, Android, something else? I'd prefer to stay away from anything too new or experimental, since those are, in my experience, rarely used in professional production environments (for better or worse). Does anyone happen to have stats on how many jobs there are for each new technology, or anecdotes about how fun each one is? I've been using Python/Django, so that's out, and it's similar to Ruby, so I don't think learning Ruby would do much to expand my skills. Anyone have any other ideas?

    Read the article

  • How can I disable the prefetch cache?

    - by Oli
    I run a few Ubuntu servers that have a load of Django sites running on them. The sites and the httpd start at boot, and after that (apart from me SSHing in to update them or using bzr to update the websites) nothing else gets run on them. At the moment over half the RAM is allocated as cache. This isn't a problem, because the cache usually makes space, or a little bit of it slips into swap (again, this doesn't really bother me), but I don't see the need for it. Is there a quick way to disable the cache? This is more of an experiment than anything else, so it would be handy to know how to turn it back on again.
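
    The page cache can't really be switched off, but for experiments like this it can be dropped on demand through the kernel's drop_caches interface, and caching simply resumes afterwards. A minimal sketch, assuming it is run as root:

        # drop_caches.py -- flush dirty pages, then ask the kernel to drop the
        # page cache, dentries and inodes (requires root).
        import os
        import subprocess

        def drop_caches():
            subprocess.check_call(["sync"])
            with open("/proc/sys/vm/drop_caches", "w") as f:
                f.write("3\n")

        if __name__ == "__main__":
            if os.geteuid() != 0:
                raise SystemExit("run as root")
            drop_caches()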

    Read the article

  • SEO penalty for landing page redirects

    - by therealsix
    Using eBay as an example, let's say I have a large number of items whose URLs look like this:

        cgi.ebay.com/ebaymotors/1981-VW-Vanagon-manual-seats-seven-/250953153841

    I want to give my client the ability to put links to these items on their website EASILY, without knowing or checking my URL. So I created a redirect service that maps their identifier to my URL:

        ebay.com/fake_redirect_service/shared_identifier9918

    would redirect to the link above. This works great: my clients can easily set up these links with information they already have, and the user sees the page as usual. So on to the problem... I'm concerned that this redirect service will have a negative impact on my SEO ranking. Having a landing page redirect you immediately to a different URL seems like something a typical spam site would do. Will this hurt me? Any better solutions?
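
    For reference, a redirect endpoint like the one described is only a few lines in most web frameworks. The sketch below uses Django purely as an illustration (the question doesn't say what the service runs on, so that is an assumption, as is Django 2.0+ for django.urls.path), with a hard-coded mapping standing in for whatever lookup the real service performs, and it issues a permanent (301) redirect:

        from django.http import Http404, HttpResponsePermanentRedirect
        from django.urls import path

        # Hypothetical mapping; in the real service this would be a database lookup.
        SHARED_LINKS = {
            "shared_identifier9918":
                "http://cgi.ebay.com/ebaymotors/1981-VW-Vanagon-manual-seats-seven-/250953153841",
        }

        def go(request, identifier):
            try:
                # a 301 marks the short URL as a permanent alias of the target
                return HttpResponsePermanentRedirect(SHARED_LINKS[identifier])
            except KeyError:
                raise Http404("unknown identifier")

        urlpatterns = [
            path("fake_redirect_service/<str:identifier>", go),
        ]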

    Read the article

  • Length of Page Title, URL, Meta Description and total number of links on a page

    - by MJWadmin
    We've been examining a number of different SEO tools recently. Several of these tell us that some of our page titles, URLs and meta descriptions are too long. We've also been told that some of our pages have too many links on them. I guess our first question is: is any of that feedback true? Can URLs, etc., actually be too long, and if so how much does this affect ranking? Secondly, can you have too many links on a page, and if so, how many is too many? Thanks in advance...

    Read the article

  • Google Site Search (commercial) not indexing files in sitemap

    - by melat0nin
    I have a client for whom we have purchased Google Site Search. It works well for HTML pages served by the CMS, but files aren't being reliably indexed. I wrote a script to generate an XML feed (sitemap) of all the files in the CMS, which I've plugged into Google Webmaster Tools for the site. It says that 923 URLs have been submitted for that sitemap, but only 26 have been indexed. The client relies heavily on searching within files, which is why we decided to use Google search, so this is a bit of a problem. Many of the files aren't linked to from any page on the site, as they are old and therefore don't merit having a page of their own, but they still need to be accessible through search for archiving purposes. The file archive XML can be found at www.sniffer.org.uk/file-archive and the standard XML sitemap (of pages) can be found at www.sniffer.org.uk/sitemap.xml. Any thoughts would be much appreciated!

    Read the article

  • Ubuntu Desktop or Ubuntu Server?

    - by Twinborn
    Hello everyone. I'm new to Linux, and Ubuntu is my first distribution. I chose Ubuntu because I want to learn more about Linux. I'm under time constraints and need to set up a server as soon as possible. I have Ubuntu Desktop and Ubuntu Server set up via VMware Fusion on my MacBook Pro. I installed everything I need on the Server edition, but it feels way over my head; I have no experience with the CLI. Can I just use Ubuntu Desktop to run my web server for the time being while I learn the CLI? I basically need to run Apache, PHP, MySQL, phpMyAdmin, Python and Django. Should I be using MAMP? Thanks.

    Read the article

  • What are the different branches of Programming? [closed]

    - by clueless
    I just want a very general overview of what the actual 'branches' of programming in the industry are. What different paths can one choose as a programmer, and what are the common frameworks/languages/platforms on those paths? Currently I'm well versed in C/C++ and Python, and I'm a beginner with Django. I want to know this because I can't decide what to proceed with after this, which route to take. Hope it's not too general a question. Thanks!

    Read the article

  • BleachBit: How to Completely Clear URL History in Firefox?

    - by tSquirrel
    14.04 / Firefox 29.0. I've been using BleachBit to clear usage/file history, and for the most part it works great. However, it doesn't seem to clear the website hostnames out of the URL bar autocomplete at all. These addresses are not bookmarked. Also, the full URL isn't retained, just the hostname.

        1. Visit http://www.bluesnews.com/some_random_URL_string
        2. Exit Firefox.
        3. Run BleachBit, with ALL Firefox options selected.
        4. Restart Firefox.
        5. Check history: completely empty, other than bookmarked sites. www.bluesnews is NOT bookmarked.
        6. Type "blue", which Firefox automatically completes as "http://www.bluesnews.com/".

    Alternate step 3: use Firefox's built-in "Clear History" and select ALL entries with a time frame of "Everything". Same result as above. My inquiry in the BleachBit forums hasn't been answered. I found Dan's proposed solution, however changing autocomplete in about:config only turns off the function; it doesn't actually stop storing URLs.

    Read the article
