Search Results

Search found 7807 results on 313 pages for 'dreamweaver sites'.

  • Developing web sites that imitate desktop apps. How to fight that paradigm? [closed]

    - by user1598390
    Suppose there's a company where web sites/apps are designed to resemble desktop apps. They struggle to add:

    - Splash screens
    - Drop-down menus
    - Tab pages
    - Pages that don't grow downward with content: the content sits inside a scrollable area, so the page has a fixed size, as if mimicking the one-screen limitation of desktop apps
    - Modal windows, pop-ups, etc.
    - Tree views
    - Absolutely no access to content unless you log in first, even for non-sensitive content; after the splash screen disappears, you are presented with a login screen
    - No links, just simulated buttons
    - A fixed page size; you cannot open a link in another tab
    - A print button that prints directly (no printable page is shown, so the user can't print via the browser's print command)
    - Progress bars for loading content, even though the browser indicates it with its own animation
    - Fonts and colors that emulate a desktop app made with Visual Basic, PowerBuilder, etc.; every app looks almost as if it were made in Visual Basic

    They reject these elements:

    - Breadcrumbs
    - Good old underlined links
    - Generated/dynamic navigation, usage-based suggestions
    - The ability to open links in multiple tabs
    - Pagination
    - Printable pages
    - The ability to produce a URL you can save or share that links to an item, like when you send someone the link to a specific StackExchange question; the only URL is the main one
    - The back button

    To achieve this, tons of JavaScript is needed: lots and lots of JavaScript and Ajax code for things unrelated to the business, needed only to hide/show that button, refresh this listbox, grey out that label, etc. The complexity generated by forcing one paradigm into another means most lines of code are dedicated to maintaining the illusion of a desktop app. What is the best way to change this mindset, and make them embrace the web and start producing modern web apps instead of desktop imitations? EDIT: These sites are intranet sites. Users hate these apps. They constantly complain about them, but they have to use them to do their daily work. These sites are in-house solutions; the end users have no choice but to use them. They are a "captive audience". Also, substitution will not happen because of the high cost. But if at least the mindset is changed, new developments would be more web-like.

  • Is there a tool for detecting sites using the same template?

    - by KTB
    I often buy web templates online, and when I do, I look at the demos to get some inspiration on how to use the components. But I would also love to look at other sites that have already implemented a full website using the template. So I am looking for a tool that searches for sites whose HTML is similar to the demos' (and which are therefore probably built on the template). I am referring to templates that do not carry static text like "Created with Template FooBar by BarFoo" in the footer.

  • Indexing Bilingual Sites

    - by Affar
    Hi all. I have a site that supports two languages. The way the user changes the language is by clicking a language link, which changes his session language from A to B and returns him to the same page. The problem I am facing is that Google doesn't index the other language, since it sees the language link as the same URL as the page itself, just with different parameters. So is there a way to inform Google of the language, or should I put the language in the URL, i.e. www.example.com/A/home?
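
    One common way to signal the language (a sketch, not advice confirmed by the question): give each language its own URL, as the question itself suggests, and emit hreflang alternate links on every page. The language codes 'en' and 'fr' and the $path variable below are hypothetical placeholders:

        <?php
        // Each language lives under its own URL prefix; every page then
        // declares all of its language alternates so crawlers can pair them.
        $path      = '/home';          // current page, minus the language prefix
        $languages = ['en', 'fr'];     // hypothetical language codes

        foreach ($languages as $lang) {
            printf(
                '<link rel="alternate" hreflang="%s" href="http://www.example.com/%s%s">' . "\n",
                $lang, $lang, $path
            );
        }

    With a distinct URL per language, each version can be crawled and indexed on its own, which the session-based switch prevents.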

  • Static methods on ASP.NET web sites

    - by Grant
    Hi, I was wondering: if I have a static method on an ASP.NET web site (plain vanilla), is it accessible by all users across all sessions? I guess what I am asking is: is there a single instance of the method per client, or one instance shared by all clients of the site?

  • Correct folder structure for sites

    - by Francesca
    I've just started tidying up the server for a particular site and am running into trouble when moving files. I originally had style.css in my main folder, plus another folder called images, so paths in the CSS went images/myimage.png. Now I have moved style.css into a folder called css, so the image links in the CSS no longer work: it's looking for the images folder relative to its own css folder. I changed the file path to /images/myimage.png, as I thought this would make it climb up a level and then look for the images folder, but this doesn't seem to work. I'm interested to see what solutions people have, and also any suggestions on how people organise their folders for a particular site. Thanks!

  • Script that replaces strings doesn't work on some sites

    - by groovy354
    I've created a simple Chrome extension that searches for certain strings using a regex and replaces matches with predefined text. It works well on most websites, but somehow the script takes no effect on, for example, Lifehacker (like this page http://lifehacker.com/5939740/five-best-audio-editing-applications?popular=true ). The code is: $('p, h1, h2, h3, span, .content, .post-body').each(function(){ //do something with $(this) }); Any ideas why Lifehacker's site is resistant to my script?

  • How can I send data from one site to other sites?

    - by moustafa
    Hi, I'm not much of a PHP coder, mainly use VB, but I have a problem with one of my apps. To make it more secure, I need every PHP request to go through one site. Here is an example of what I mean: the application loads and sends the IP and location to two servers (a.php and b.php); the problem so far is that the PC makes direct connections to both pages. What I am trying to do is have it send a single request to z.php, and have z.php pass the data on to a.php and b.php. My question is: how would I set up z.php? I hope I make sense; I have looked everywhere and couldn't find an answer.
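
    A minimal sketch of a relay in z.php, assuming a.php and b.php accept POSTed form fields (the URLs and field names below are placeholders, not from the question):

        <?php
        // z.php -- the client posts here once; this script forwards the data
        // to both backends server-side, so the client never contacts them.
        $targets = ['http://example.com/a.php', 'http://example.com/b.php'];
        $payload = [
            'ip'       => $_POST['ip']       ?? '',
            'location' => $_POST['location'] ?? '',
        ];

        foreach ($targets as $url) {
            $ch = curl_init($url);
            curl_setopt_array($ch, [
                CURLOPT_POST           => true,
                CURLOPT_POSTFIELDS     => http_build_query($payload),
                CURLOPT_RETURNTRANSFER => true,   // don't echo the backend reply
                CURLOPT_TIMEOUT        => 10,
            ]);
            curl_exec($ch);
            curl_close($ch);
        }
        echo 'ok';

    The app then only ever talks to z.php, and the real endpoints can be firewalled off from the public internet.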

  • Sites integration

    - by bassem ala
    Hey, has anyone had this task before? Suppose you have a website called www.a.com and a second one, www.b.com. a.com has a button; when you are logged in to a.com and click the button, you are sent to b.com. The question is: how can I do this while carrying over the user's credentials, so that when he is sent to b.com he is automatically logged in there? PS: every user who has an account on a.com definitely has an account on b.com. Any ideas, please? Thank you for your ideas and support :)
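
    One common pattern (a sketch under the assumption that both sites can share a secret key; the user name and URLs are placeholders): a.com puts a short-lived signed token in the link, and b.com verifies it before starting its own session.

        <?php
        $secret = 'shared-secret-known-to-both-sites';   // placeholder

        // --- on a.com: build the hand-off link behind the button ---
        $user    = 'alice';                  // the locally logged-in user
        $expires = time() + 60;              // token is valid for one minute
        $sig     = hash_hmac('sha256', $user . '|' . $expires, $secret);
        $link    = 'http://www.b.com/sso.php?user=' . urlencode($user)
                 . '&expires=' . $expires . '&sig=' . $sig;

        // --- on b.com (sso.php): verify the token, then log the user in ---
        $check = hash_hmac('sha256', $_GET['user'] . '|' . $_GET['expires'], $secret);
        if (hash_equals($check, $_GET['sig']) && time() < (int)$_GET['expires']) {
            session_start();
            $_SESSION['user'] = $_GET['user'];   // auto-login on b.com
        }

    Since every a.com account is guaranteed a matching b.com account, b.com only needs to trust the signature, not re-check a password.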

  • Azure: Mobile Services and Web Sites enter production; the infrastructure stores 8.5 trillion objects and handles 900,000 transactions per second

    Windows Azure: Mobile Services and Web Sites enter production. The infrastructure stores 8.5 trillion objects and handles 900,000 transactions per second. Available in preview since August 2012, Windows Azure Mobile Services has reached general availability (GA), along with Windows Azure Web Sites, a milestone that marks these services' entry into production. As a reminder, Windows Azure Mobile Services is a Backend as a Service (BaaS) platform that provides a turnkey cloud solution for accelerating the development of connected client-side applications.

  • session_set_cookie_params on multi-domain sites

    - by nillls
    Hi! I'm currently developing an application (www.domain.se, .eu) where we're experiencing problems with sessions not propagating across domains. Internet Explorer is the root cause: it treats sessions as distinct depending on whether the user typed "domain.se" or "www.domain.se". Due to some unfortunate redirecting, we're not able to keep the user on the address he typed in; instead we always redirect to www.domain.se on login. Needless to say, IE users cannot log in after typing "domain.se". To make this error go away, we implemented a function that tries to make the session valid across all possible domains by doing the following: if($_SERVER['HTTP_HOST'] == "domain.se") { session_set_cookie_params(3600, '/', '.domain.se', true); } There are basically a few if:s we go through depending on what address the user typed in, but the third argument stays the same. This, however, results in no one being able to log in, regardless of domain. I've tried reading up on how session_set_cookie_params() works, but to no avail. Any help is greatly appreciated!
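
    A likely explanation, judging only from the snippet: the fourth argument of session_set_cookie_params() is $secure, and passing true marks the session cookie as HTTPS-only, so on a site served over plain HTTP the browser never sends the cookie back and every login fails. A minimal sketch of the usual setup, assuming the site runs over HTTP and one session should span all subdomains of domain.se:

        <?php
        // Must be called before session_start(), or it has no effect
        // on the cookie issued for this request.
        session_set_cookie_params(
            3600,          // lifetime in seconds
            '/',           // path: the whole site
            '.domain.se',  // leading dot: domain.se and every subdomain
            false,         // secure: only set true if the site is HTTPS-only
            true           // httponly: hide the cookie from client-side script
        );
        session_start();

    With the cookie domain fixed at '.domain.se', the per-host if:s also become unnecessary, since the same cookie is valid for both "domain.se" and "www.domain.se".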

  • Are there any scripts to synchronize sites?

    - by Matrym
    I've just set up fail-over DNS to switch the site to a second host if the first is down. This is great for showing an old/archived version of the site, but I suspect maintenance is going to be a real pain. I moved the files over with rsync in the first place. Is this the kind of thing that could be run as a cron job, automatically moving over newer files?
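
    rsync does lend itself to unattended runs from cron. A sketch of a small PHP CLI wrapper around it that cron can invoke (the paths, host, and schedule below are placeholders, not taken from the question; key-based SSH authentication is assumed so no password prompt appears):

        #!/usr/bin/env php
        <?php
        // sync.php -- one-way mirror from the primary host to the failover.
        // Schedule it with cron, e.g. every 15 minutes:
        //   */15 * * * * /usr/bin/php /home/user/sync.php
        $src = '/var/www/site/';
        $dst = 'user@failover.example.com:/var/www/site/';

        // -a preserves permissions and timestamps, -z compresses in transit,
        // --delete removes files on the failover that were deleted upstream.
        $cmd = sprintf(
            'rsync -az --delete %s %s 2>&1',
            escapeshellarg($src),
            escapeshellarg($dst)
        );
        exec($cmd, $output, $status);
        exit($status);   // a non-zero exit makes cron report the failure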

  • COM+/Desktop Heap errors in IIS affecting sites at random?

    - by tresstylez
    We have a Win2K3 server hosting 30+ sites. Each site is configured with its own unique application pool, so that we can manually recycle specific sites if needed without killing sessions for the others. From what I've read, the consequence of this setup is that each application pool worker process gets allocated a desktop heap (normally 512 KB), which limits the number of app pools we can serve. http://blogs.msdn.com/b/david.wang/archive/2006/01/25/security-considerations-of-usesharedwpdesktop-on-iis6.aspx PROBLEM: What we're seeing is that occasionally COM+ errors get triggered, presumably by hitting our 512 KB desktop heap limit, and certain sites become unresponsive (or throw errors) until we manually recycle their specific app pool. I know that I can increase the desktop heap limit to 1024 KB and make other tweaks and tunings, but I've been tasked with finding out what exactly causes one site's heap to max out as opposed to another's. When we start seeing COM+ errors, the sites affected seem random: small sites or big (more heavily used) ones. Is it based on process ID? Traffic? Any pointers on understanding this a little better would be excellent. Thanks! jg

  • How do sites avoid SEO issues / legalities with subdomain unique ids?

    - by JM4
    I was looking through a few websites recently and noticed a trend I'm not sure I understand. Sites are creating unique referral URLs for customers in the form http://customname.site.com (using http://www.site.com/customname would work the same way). Watching the network flow in Google Chrome, I can see the sites do a 302 redirect at some point, apparently via some htaccess rule that takes the subdomain name (customname), applies it as a referral parameter, and then keeps it in the session for the rest of the visit. However, there must be thousands of these custom URLs that people type in. How is each of these "subdomains" not treated as a separate URL that redirects to the same page (in short, tons of links all pointing at the same page, which Google would normally frown upon)? Additionally, the links also appear on the sites themselves as clickable links, so I'm not sure how these avoid being tracked. Similarly, the "unique" URLs are not indexed or cached in any Google search results. How is this handled? A true example of this (without the referral aspect) is http://sfgiants.com, which does a 302 redirect to the much longer official San Francisco Giants MLB homepage. I am wondering how sfgiants.com is not indexed (assuming that short link appears on several MLB pages)? 1 - I know these are 302 redirects; I can see this in the sites' network flow. 2 - These links do in fact appear on the page itself, because some areas (for example, the bottom of the page) may say: "Send this page to a friend! http://name.site.com/", which in turn redirects to something like http://www.site.com?id=name so the id value can be stored in the session.
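
    A sketch of the mechanism described above, assuming wildcard DNS points every *.site.com host at one script (all names are placeholders):

        <?php
        // Collapse every vanity host into one canonical URL: the subdomain
        // becomes a referral parameter and the visitor gets a 302, so only
        // www.site.com itself serves content.
        $host = $_SERVER['HTTP_HOST'];                     // "customname.site.com"
        $name = preg_replace('/\.site\.com$/', '', $host); // "customname"

        if ($name !== '' && $name !== 'www' && $name !== $host) {
            header('Location: http://www.site.com/?id=' . urlencode($name), true, 302);
            exit;
        }

    This is only a sketch of the redirect side; the SEO side (why the vanity hosts stay out of the index) is the open question posed above.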

  • A frequently updated mixed bag blog OR several seldom updated niche sites?

    - by Melanie
    Background: I am a member of the website HubPages, where I have about a hundred articles (and I'm always writing more). HubPages' revenue model is a 40% ad share for them and a 60% ad share for users. While the user base there is really friendly, the site is REALLY slow and buggy, and there is a ton of content on HubPages that is copied from other sources. Upon flagging these articles, it takes a ton of time for mods to remove them, and it's just generally dragging down my stuff. Furthermore, HubPages was hit really hard by Google's Panda update: http://www.google.com/search?hl=en&rlz=1B3GGLL_enUS426US426&tbm=nws&q=google+panda& Aside from the temporary problems I would deal with when removing content from HubPages and putting it on my own domain (duplicate content, etc.), I have another question: which option would be best for my articles? I have tons of articles in a wide variety of niches and would like to do whatever helps them perform best. I'm not a huge niche writer and have received wide criticism from the HubPages community for my articles not performing as well as they could, because I don't use enough keywords within the text; I prefer to write more naturally, in a way that appeals to an audience, instead of keyword-stuffing. Anyway, this is beside the point. My question: after removing my articles from HubPages, should I put them on one domain or spread them across multiple domains, grouped roughly by topic? For example: a-bunch-of-articles.com OR travel-articles.com, financial-articles.com and knitting-articles.com (I know those domains aren't available; it's just an example). Here are the pros and cons of each:

    - A mixed-bag site like a-bunch-of-articles.com may not perform as well because of its mixed-bag nature.
    - A mixed-bag site would be updated far more frequently than several niche sites; some niche sites might be updated so infrequently that a year could pass before a new article appears.
    - A mixed-bag site would be putting all my eggs in one basket, whereas several niche sites would spread out my portfolio, so to speak.
    - A mixed-bag site would be cheaper: $14 (two-year registration) plus hosting and I'm good to go.
    - A mixed-bag site wouldn't let me easily target keywords, but then again, isn't HubPages pretty much a mixed-bag site?

  • Is there any negative impact with similar page titles and descriptions on similar sites?

    - by ElHaix
    Currently we have Canadian versions of some websites. We are going to create American versions, which are essentially the same except that the search results are geo-specific to the USA. The format of the results page titles and descriptions will remain the same, i.e. {0} in {1} | Find more {0} etc., and the search terms will most likely be the same on both sites. Will the relative similarity of the page titles and descriptions between the Canadian and US sites have any negative SEO impact, given that geo-location is the most significant difference?

  • JavaScript is said to be the most popular programming language according to the RedMonk index, based on GitHub and Q&A sites

    JavaScript is the most popular programming language in the RedMonk ranking, based on GitHub and Q&A sites. Between popularity and preference, which is the better criterion for ranking programming languages? The analyst firm RedMonk ranks them by their popularity within the developer community on GitHub and StackOverflow. RedMonk's ranking reveals quite a different result from the TIOBE index that we report on periodically at Developpez.com: RedMonk puts JavaScript at the top of the list, followed closely by Java, PHP...

  • How can I create multiple mini-sites with similar/duplicate content without hurting my search engine rank?

    - by ekpyrotic
    Essential background: I run a small company that lets members of the public post handwritten letters to their local politician (UK-based). Every week a number of early stage bills (called Early Day Motions) are submitted for debate in the House of Commons, and supporters of the motion will contact their local Members of Parliament, asking them to sign the motion. The crux: I want to target these EDMs with customised mini-sites, so when people search "EDM xxx", they find my customised mini-site, specifically targeting that EDM (i.e., "Send a handwritten letter to your MP asking them to sign EDM xxx"). At the moment, all these mini-sites (and my homepage) have duplicate content with only the relevant EDM name, number, and background image changed. (For example, http://mailmymp.com and http://mailmymp.com/edm/teaching-life-saving-skills-at-school-edm-550.php). The question: Firstly, will this hurt my potential search engine ranking? And, if so, what's the best way to target these political campaigns in an efficient manner without hurting my SEO prospects?

  • My Sites Were Hacked. What To Do?

    - by Vad
    I host multiple domains with a very popular hosting provider, and I just went to one of my sites and... saw a black page with the message "Hacked by...". I checked, and all my sites with this provider are showing the same page. In the file system, I can see the hacker placed default.* and index.* files containing this message under every folder (I repeat: every folder), overwriting all the index pages. Cleaning this up will be a horrible job. What should I do (right now I am awaiting the restore of files from the hosting provider)? How do I prevent this? Whom should I blame?

  • SMS Gateways - How do other sites do it? [closed]

    - by chobo2
    Possible Duplicate: Send and Receive SMS from my Website I would love to have a feature on my site that sends email reminders and SMS (text) messages to people's mobile phones. I've been searching around, and all I am finding are APIs that charge money per SMS message (as low as 1 cent per message). However, even at 1 cent per message, that is still too much: the amount of money I charge per year could be severely eroded by the SMS messages alone. I could of course charge more for my service, or make SMS messages a paid add-on, but I don't think either would work, as most people expect it to be a free feature; if they have to pay anything, it's because their carrier charges them, not the website. How do other sites do it? I'm guessing companies like Google have their own gateway providers or something like that. But what about smaller sites, what do they do? I can't see them paying per SMS text message.

  • Is there a free, reliable place to get site statistics, more reliable than Alexa, Quantcast, or Compete?

    - by S.gfx
    It seems there's no way; I am just asking in case someone knows of a newer, more accurate site. I am aware of Alexa's, Compete's and Quantcast's inaccuracies and/or the limited range of sites they gather stats for. I also know that websitegrader is perhaps a little more accurate (although I'm not sure it provides the data I am after), and I've read that SEOmoz's tools are reliable. I am, though, still looking for a free solution, a 'reliable' Alexa: not one that depends on a toolbar installation, is easy to trick, has stats that are way off, or covers only a very limited range of sites. I am almost sure there's nothing new, but I wanted to be sure.

  • Google launches Tag Manager, a free tool that simplifies managing tracking and marketing tags on websites

    Google launches Tag Manager, a free tool that simplifies managing tracking and marketing tags on websites. Google has just announced the launch of Google Tag Manager, its new tool for managing the various tags on a website. To better monetize their websites and control how their content is used, site managers turn to tracking and analytics tools such as Google Analytics. For each such service, a snippet of code must be embedded in every page of the site. Although each is relatively simple to use, the proliferation of these snippets on a page can make them tedious to manage. Moreover, the requests...
