Search Results

Search found 57665 results on 2307 pages for 'web next'.

  • dynamic urls and links on one web page

    - by John
    I am trying to figure out how to create dynamic links and URLs on a static webpage. What I want to do is the following: I have a single webpage, for example MYWEBPAGEdotCOM/INDEX.HTML, that will always look the same except for one link on the page. The link on the page would be, for example: LINK TO AFFILIATE: affiliatedotCOM/my-affiliate_code_here_DYNAMIC_REFERER. The only thing that would change is the "DYNAMIC_REFERER", based on the dynamic URL used to reach the page: MYWEBPAGEdotCOM/INDEX.PHP?id=test1, MYWEBPAGEdotCOM/INDEX.PHP?id=test2, MYWEBPAGEdotCOM/INDEX.PHP?id=test3, MYWEBPAGEdotCOM/INDEX.PHP?id=test4, which would only change the dynamic link on the page to: affiliatedotCOM/my-affiliate_code_here_test1, affiliatedotCOM/my-affiliate_code_here_test2, affiliatedotCOM/my-affiliate_code_here_test3, affiliatedotCOM/my-affiliate_code_here_test4. Can someone tell me how I could go about doing this? I don't want to have to make hundreds of pages, and doing it this way would prevent that.
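
    For illustration, here is a minimal PHP sketch of the idea: a single index.php reads the id query parameter and drops it into the affiliate link, so one page serves every variation (the domain, parameter name and affiliate code below are placeholders, not taken from the question):

        <?php
        // index.php - keep only expected characters from the id parameter
        $referer = isset($_GET['id']) ? preg_replace('/[^A-Za-z0-9_-]/', '', $_GET['id']) : 'default';
        ?>
        <html>
        <body>
            <!-- everything else on the page stays static -->
            <a href="http://affiliate.example.com/my-affiliate_code_here_<?php echo htmlspecialchars($referer); ?>">
                LINK TO AFFILIATE
            </a>
        </body>
        </html>

    Linking to INDEX.PHP?id=test1 would then render the affiliate link ending in _test1, with no extra pages to maintain.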

    Read the article

  • How can a large company foster excellence in its engineers?

    - by Joshiatto
    I am tasked with improving the skills (quality & speed) of engineers in my company. Here are some ideas:
    - Pair programming
    - TDD
    - Automated check-in policies
    - Talks given by experts
    - Awards for coding excellence
    - Encourage competition among engineers to contribute to GitHub
    - Publish standards and practices docs on the intranet site
    - "Gamification" of engineering: somehow make becoming badasses into a game they will enjoy playing
    - Training
    - Showcase GitHub check-ins on screens around the office
    - Add an "engineer of the month" to the intranet home page
    How can I drive traffic to the intranet home page? What crazy futuristic idea would drive engineers to go to the page every day to see which of their peers are making more money than them (inferred via recognition), and then go off and improve their skills and productivity to see their standings improve on the home page? Or any ideas just to foster collaboration and love for their jobs, so they start taking more pride in their work? Don't take my ideas as symptomatic of our org. I take full responsibility for not knowing the right way to do this.

    Read the article

  • Useful code paste site tools

    - by acidzombie24
    I know there are site tools to check if your webpage is alive, has compression, etc., but let's not get into that. What are useful sites to paste code in and to share links of? The three I know are:
    - http://codepad.org/ - shows source and runs code online
    - http://www.pastie.org/ - share source with syntax highlighting
    - http://jsfiddle.net/ - great for JS help or for the occasional test
    What else do you know of? One answer per question. I'll let lints and validators slide, since you do paste code into them. Mention a weakness if you know one, so others won't be surprised or disappointed.

    Read the article

  • Robots.txt practices with .htaccess redirections (inherits)

    - by Jayhal
    I have a question regarding how to write robots.txt files for many domains and subdomains with redirects in place. We have a hosting account with primary and add-on domains. All of our domains and subdomains, including the primary domain, are redirected via .htaccess 301s to their own subdirectories in the primary domain's root directory. I'm confused about how I would write the robots.txt for certain directories. First, I wanted to confirm that I am right in understanding that, for domains and subdomains, crawlers will look to the directory that acts as that URL's root directory for the crawling rules (robots.txt). Also, that a directory will not be affected by a robots.txt present in its parent directory if the directory has its own domain/subdomain, and that URL is the one being accessed by crawlers. (I am pretty sure, but I wanted to confirm I didn't have a fundamentally flawed understanding of robots.txt.) In the original root directory on the account (where the primary domain pointed before the .htaccess rules were put in place), what should the robots.txt contain? When crawlers look to crawl our primary domain, will they look to the original root directory for the robots.txt, or will they reference the file contained in the new subdirectory where all the primary domain's site files are located? If so, what should the root's robots.txt include, if anything at all? Would I be right to include a simple 'Disallow: /' for all agents, and then include more specific robots.txt files in each subdirectory with more specific instructions? Would that affect the crawling of the directory where the primary domain is now redirected? Any help is greatly appreciated. Thanks!
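
    For what it's worth, a minimal sketch of the "block everything at the old root" idea described above could be as simple as this (purely illustrative):

        # robots.txt placed in the original account root
        User-agent: *
        Disallow: /

    Keep in mind that crawlers request /robots.txt from the root of each host name, so the file that applies to a given domain is whichever one that host ends up serving at that path once the .htaccess rules have run.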

    Read the article

  • What do Oracle VM Templates for JD Edwards EnterpriseOne 9.1 have to do with your Next Vacation!

    - by Monica Kumar
    Oracle VM Templates for JD Edwards EnterpriseOne 9.1 Update 2 and JD Edwards EnterpriseOne Tools 9.1.3.3 are now available for download from the Oracle Software Delivery Cloud. So, what do Oracle VM Templates have to do with your next vacation? Well, how about time savings, so you can plan for that next vacation and have peace of mind, since Oracle did the work and the testing for you! What’s inside the new Oracle VM Templates release? The Oracle VM Templates for JD Edwards EnterpriseOne enable you to rapidly install JD Edwards EnterpriseOne. The complete stack includes:
    - JD Edwards EnterpriseOne Applications Release 9.1 Update 2
    - JD Edwards EnterpriseOne Tools 9.1 Update 3, maintenance pack 3 (9.1.3.3)
    - Oracle Database 11g Release 2
    - Oracle WebLogic Server 10.3.5
    All pre-configured and pre-tested to run on Oracle Linux 5. Yes, the OS is included in the template!
    - Oracle Business Intelligence Publisher 11.1.1.7.1, for use with JD Edwards EnterpriseOne One View Reporting
    - JD Edwards EnterpriseOne Business Services Server and Oracle Application Development Framework (ADF) 11.1.1.5, for use with the JD Edwards EnterpriseOne Mobile Applications
    All pre-tuned to support up to 100 interactive users. The templates can be installed to Oracle VM Server for x86 release 3.1 or later, to the Oracle Exalogic Elastic Cloud, and to the Oracle Database Appliance. Simply visit http://edelivery.oracle.com/oraclevm, download and unzip the files, read the readme, and you’re ready to go. How long would it take you to install each of the components above and configure and tune them all from scratch? We know that you can get 7-10x faster deployment using the Oracle VM Templates. Now, how about that snorkeling trip to Belize!

    Read the article

  • Basics of Hosting [closed]

    - by Bala
    Assume we know nothing about web hosting but need to get a site online. What questions do we need to ask potential web hosting companies? What are the pitfalls and places where things can go terribly wrong? Are there any general good or bad things to be on the lookout for? Site could be anything from basic HTML up to e-commerce. We're looking for general thoughts that could apply to any web hosting. Thanks!

    Read the article

  • DotNetNuke Development - The Right Tool For Web Development Today

    With the emergence of websites as one of the primary modes of communication on the Internet, many tools have been developed to assist in creating sites that are capable of meeting the highest expectations of their visitors. This article discusses DotNetNuke development for building sites for the new generation of visitors on the Internet.

    Read the article

  • Is dynamic HTML layout good from an SEO perspective?

    - by sll
    Just wondering whether a dynamically built HTML layout is fine from an SEO perspective. Let's assume an e-commerce engine and its most popular page - the product catalog. 90% of the page is built using AJAX and the MVVM library knockoutjs, which builds the HTML on the fly on the client side. How would search bots parse such content? Will it be indexed properly, and will it be as effective for SEO as server-side-built HTML pages?

    Read the article

  • Firefox Fonts Blurry (Web Content ONLY, **NOT** Window Decorations or Other Programs)

    - by Lambert
    A picture is worth a thousand words... so does anyone know how to fix this font blurriness in Firefox? (You'll need to right-click the picture below and go to View Image to view it full-size; it's too small to see anything here.) Note: my other applications (and the Firefox non-client area, as you can see in the screenshot) are completely fine, so obviously going to System > Appearance and changing the font settings isn't fixing the situation.

    Read the article

  • File Upload Forms: Security

    - by Snow_Mac
    So I'm building an application for uploading files. We're paying scientists to contribute information on pests, diseases and bugs (for plants). We need the ability to drag and drop a file to upload it. The question is: since the users will be authenticated and set up by us, will it be necessary to include a virus scanner to prevent the uploading and insertion of malicious files? How important is this?
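
    If a scanner is added, a minimal PHP sketch might shell out to ClamAV's clamscan after the upload; this assumes ClamAV is installed on the server, and the form field and destination path below are made up for the example:

        <?php
        // handle_upload.php - reject the file if clamscan reports an infection
        $tmp  = $_FILES['document']['tmp_name'];
        $dest = '/var/uploads/' . basename($_FILES['document']['name']);

        // clamscan exits with 0 when the file is clean, 1 when a virus is found
        exec('clamscan --no-summary ' . escapeshellarg($tmp), $output, $status);

        if ($status === 0 && move_uploaded_file($tmp, $dest)) {
            echo 'Upload accepted.';
        } else {
            echo 'Upload rejected.';
        }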

    Read the article

  • Inline Image in ASP.NET

    - by Ricardo Peres
    Inline images are a technique that, instead of referring to an external URL, includes all of the image’s content in the HTML itself, in the form of a Base64-encoded string. It avoids a second browser request, at the cost of making the HTML page slightly heavier and not using the cache. Not all browsers support it, but current versions of IE, Firefox and Chrome do. In order to use inline images, you must write the img element’s src attribute like this:

        <img src="data:image/gif;base64,R0lGODlhEAAOALMAAOazToeHh0tLS/7LZv/0jvb29t/f3//Ub/
        /ge8WSLf/rhf/3kdbW1mxsbP//mf///yH5BAAAAAAALAAAAAAQAA4AAARe8L1Ekyky67QZ1hLnjM5UUde0ECwLJoExKcpp
        V0aCcGCmTIHEIUEqjgaORCMxIC6e0CcguWw6aFjsVMkkIr7g77ZKPJjPZqIyd7sJAgVGoEGv2xsBxqNgYPj/gAwXEQA7"
        width="16" height="14" alt="embedded folder icon"/>

    The syntax is: data:[<mediatype>][;base64],<data>

    I developed a simple control that allows you to use inline images in your ASP.NET pages. Here it is:

        public class InnerImage: Image
        {
            protected override void OnInit(EventArgs e)
            {
                // Resolve the image on disk and read its bytes
                String imagePath = this.Context.Server.MapPath(this.ImageUrl);
                String extension = Path.GetExtension(imagePath).Substring(1);
                Byte[] imageData = File.ReadAllBytes(imagePath);

                // Replace the URL with a Base64-encoded data URI
                this.ImageUrl = String.Format("data:image/{0};base64,{1}", extension, Convert.ToBase64String(imageData));

                base.OnInit(e);
            }
        }

    Simple, don’t you think?

    Read the article

  • In the Cloud, Everything Costs Money

    - by BuckWoody
    I’ve been teaching my daughter about budgeting. I’ve explained that most of the time the money coming in is from only one or two sources – and you can only change that from time to time. The money going out, however, is to many locations, and it changes all the time. She’s made a simple debits and credits spreadsheet, and I’m having her research each part of the budget. Her eyes grow wide when she finds out everything has a cost – the house, gas for the lawnmower, dishes, water for showers, food, electricity to run the fridge, a new fridge when that one breaks, everything has a cost. She asked me “how do you pay for all this?” It’s a sentiment many adults have looking at their own budgets – and one reason that some folks don’t even make a budget. It’s hard to face up to the realities of how much it costs to do what we want to do.

    When we design a computing solution, it’s interesting to set up a similar budget, because we don’t always consider all of the costs associated with it. I’ve seen design sessions where the new software or servers are considered, but the “sunk” costs of personnel, networking, maintenance, increased storage, new sizes for backups and offsite storage and so on are not added in. They are already on premises, so they are assumed to be paid for already.

    When you move to a distributed architecture, you'll see more costs directly reflected. Store something, pay for that storage. If the system is deployed and no one is using it, you’re still paying for it. As you watch those costs rise, you might be tempted to think that a distributed architecture costs more than an on-premises one. And you might be right – for some solutions. I’ve worked with a few clients where moving to a distributed architecture doesn’t make financial sense – so we didn’t implement it. I still designed the system in a distributed fashion, however, so that when it does make sense there isn’t much re-architecting to do. In other cases, however, if you consider all of the on-premises costs and compare those accurately to operating a system in the cloud, the distributed system is much cheaper. Again, I never recommend that you take a “here-or-there-only” mentality – I think a hybrid distributed system is usually best – but each solution is different. There simply is no “one size fits all” to architecting a solution.

    As you design your solution, cost out each element. You might find that using a hybrid approach saves you money in one design and not in another. It’s a brave new world indeed. So yes, in the cloud, everything costs money. But an on-premises solution also costs money – it’s just that “dad” (the company) is paying for it and we don’t always see it. When we go out on our own in the cloud, we need to ensure that we consider all of the costs.

    Read the article

  • Web developing - Strange happenings

    - by Jason
    As I'm teaching myself PHP and MySQL during break, I'm experimenting with coding in an Ubuntu virtual machine where Apache, MySQL and PHP have been installed and configured to a shared folder. I'm not a big fan of Kompozer because the source code layout is a PIA, so I've started checking out gPHPEdit. However, since using it, I've come across two issues: (1) When I edit the .html and .php files, sometimes the file extension will change to .html~ and .php~, becoming invisible to the browser. The only solution is to switch to Windows, right-click and rename the file extension. (2) In Ubuntu Firefox, when I click my project's Submit button in a practice form, a dialog box pops up asking what Firefox should do with the .php file, rather than simply displaying it in the browser. When I do this in Windows Chrome & Firefox, it goes right to the response page. I'm not sure if this behavior is limited to gPHPEdit/Kompozer, but I've never noticed this happening in Dreamweaver. Any solutions? EDIT: The behavior in point 1 occurs both when Dreamweaver is open in Windows accessing the same files and when it is not. I changed the extension filename of welcome.php, added a comment in gPHPEdit, and the file changed to welcome.php~ upon saving.

    Read the article

  • Web Deploy to IIS7 fails with 401 (Unauthorized)

    - by Trex
    We have IIS7 running on Windows Web Server 2008 R2, and it's set up to support Web Deploy. It worked fine when we used the default Administrator account. We recently disabled this account (for security reasons) and are now trying to deploy using another account which is a member of the Administrators group, but the deploy fails with 401 (Unauthorized). More specifically, it says: Connected to '<IP>' using Web Deployment Agent Service, but could not authorize. Make sure you are an admin on '<IP>'. The remote server returned an error: (401) Unauthorized. Does anybody have any idea why this is happening? Thanks. Trex

    Read the article

  • Google Webmaster Tools, DNS Errors & HostPapa

    - by Gravy
    Received a message from Google Webmaster Tools: "Over the last 24 hours, Googlebot encountered 2 errors while attempting to retrieve DNS information for your site. The overall error rate for DNS queries for your site is 40.0%. You can see more details about these errors in Webmaster Tools. Recommended action:"

    Contacted HostPapa and they deny that there is any issue with the site/server! Support in terms of what I can do to actually resolve this issue is non-existent. The site is currently online, and I don't know much about DNS... so any advice about what I can do to resolve this problem would be much appreciated. Basically, the message from Google says that it is my web host's fault; the message from my web host (HostPapa) is, "Just tell Google to crawl your site, as there are no errors."

    Read the article

  • GWT: reporting crawling errors for non-existent links

    - by pixeline
    Google Webmaster Tools is reporting crawl errors for links that never existed, and if I check the "Linked from" tab for a given error link, it shows another link that never existed. They all mention joomla/, which is not the CMS used on this domain (it's WordPress, FYI). Example: http://example.com/joomla/index.php/component/user/register, linked from: http://example.com/joomla/component/user/login?return=L2###### What is going on?

    UPDATE 1: I tried something. I provided one of the faulty URLs to the "Fetch as Google" functionality. Instead of returning a 404, it returns a 301 to another Joomla page:

    HTTP/1.1 301 Moved Permanently
    Server: Apache/2.4.3
    X-Powered-By: PHP/5.4.4-10
    X-Pingback: http://example.com/xmlrpc.php
    Expires: Wed, 11 Jan 1984 05:00:00 GMT
    Cache-Control: no-cache, must-revalidate, max-age=0
    Pragma: no-cache
    Set-Cookie: PHPSESSID=1fgr5v2oip39miibuptd51s8h0; path=/
    Set-Cookie: woocommerce_items_in_cart=0; expires=Sat, 12-Jan-2013 11:44:01 GMT; path=/
    Location: http://example.com/joomla/component/user/register
    Content-Type: text/html; charset=iso-8859-1
    Content-Length: 387
    Date: Sat, 12 Jan 2013 12:44:01 GMT
    Via: 1.1 varnish
    Connection: keep-alive
    Accept-Ranges: bytes
    Age: 0

    <!DOCTYPE HTML PUBLIC "-//IETF//DTD HTML 2.0//EN">
    <html><head>
    <title>301 Moved Permanently</title>
    </head><body>
    <h1>Moved Permanently</h1>
    <p>The document has moved <a href="http://example.com/joomla/component/user/register">here</a>.</p>
    <p>Additionally, a 301 Moved Permanently error was encountered while trying to use an ErrorDocument to handle the request.</p>
    </body></html>

    Read the article

  • Is there a free embedded SSH solution?

    - by ereOn
    Hi, I'm working for an important company which has some severe network policies. I'd like to connect from work to my home Linux server (mainly because it allows me to monitor my home-automation installation, but that's off-topic), but of course any SSH connection (TCP port 22) to an external site is blocked. While I understand why this is done (to avoid SSH tunnels, I guess), I really need to have some access to my box. (Well, "need" might be exaggerated, but that would be nice ;) Do you know of any web-based solution that I could install on my home Linux server that would give me a pseudo-terminal (served using HTTPS) embedded in a web page? I'm not necessarily looking for something graphical: a simple web-embedded SSH console would do the trick. Or do you guys see any other solution that wouldn't compromise network security? Thank you very much for your solutions/advice.

    Read the article

  • Can AJAX in a CMS slow down your server?

    - by Saif Bechan
    I am currently developing some plugins for WordPress, and I was wondering which route to take. Let's take an example: you want to display the last 3 tweets on your page.

    Option 1: You do things the normal way, inside WordPress. Someone visits the website and, while generating the page, you fetch the tweets in PHP via the Twitter API and just display them where you want. Now the small problem with this is that you have to wait for the response from Twitter. This takes a few ms. No real problem, but this question is just out of curiosity.

    Option 2: Here you don't do anything in WordPress on the initial load, but you do have the API inside. Now you just generate the page, and as soon as the page is done on the client side, you do a small AJAX call back to the server via a WordPress plugin to fetch your latest tweets, i.e. asynchronously. Now the problem with this, IMO, is that you have much more stress on your server. For starters, you have two HTTP requests instead of one. Secondly, the WordPress core has to load two times instead of one.

    Other options: Now I know there are a lot of other options: 1) getting the tweets directly via JavaScript, with no stress on the server at all; 2) caching the tweets so they are fetched from the DB instead of using the API every time; 3) getting the tweets from an AJAX call that is not a WordPress plugin; 4) many more.

    My question: If you only compare 1 and 2, which would be the better choice?
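
    To make the comparison concrete, here is a rough sketch of the caching option combined with option 2, using WordPress transients so the Twitter API is only hit when the cache expires (the hook, transient key and endpoint URL are invented for the example):

        <?php
        // Plugin sketch: AJAX endpoint that serves cached tweets
        add_action('wp_ajax_my_latest_tweets', 'my_latest_tweets');
        add_action('wp_ajax_nopriv_my_latest_tweets', 'my_latest_tweets');

        function my_latest_tweets() {
            $tweets = get_transient('my_latest_tweets');                  // cached copy, if any
            if (false === $tweets) {
                $response = wp_remote_get('https://api.twitter.com/...'); // placeholder URL
                $tweets   = wp_remote_retrieve_body($response);
                set_transient('my_latest_tweets', $tweets, 300);          // cache for 5 minutes
            }
            header('Content-Type: application/json');
            echo $tweets;
            wp_die(); // end the AJAX request cleanly
        }

    With the cache in place, option 2 still pays for the second HTTP request and the extra WordPress bootstrap, but neither option has to wait on Twitter for most page views.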

    Read the article

  • GoDaddy VPS or switch to another provider?

    - by Charlie
    Long story short, I want to have close to full control over my server, mainly to be able to install things on my server that GoDaddy shared hosting does not allow (gzip, etc.). Should I switch providers to something else (all my domains are hosted there) or upgrade my hosting to a VPS? If I upgrade, how hard will it be to set it up without their "assisted service" setup (where they do virus scanning, etc.)?

    Read the article

  • DNS - domain conflict?

    - by Stefanos.Ioannou
    I was given two domains, domain.com & domain.info (they are on GoDaddy). I was also given two servers on Digital Ocean: 107.105.38.99 (Rails app) and 107.107.90.17 (WordPress platform). At first, I was instructed to associate domain.com with 107.107.38.99 (the Rails app). Then I was instructed to de-associate this IP from domain.com and associate 107.107.90.17 with the domain name domain.com. Then I was instructed to associate domain.info with 107.107.38.99 (the Rails app). Right now, when I go to domain.com, the WordPress platform (107.107.90.17) loads fine, and that is what is expected. But when I go to domain.info for the Rails app (107.107.38.99), I get redirected to domain.com. This is not expected, and this is really weird to me. When I ping domain.info I get this:

    PING domain.info (107.107.38.99): 56 data bytes
    64 bytes from 107.107.38.99: icmp_seq=0 ttl=50 time=74.601 ms

    which is the expected result showing the correct IP, but I don't understand why I get redirected to domain.com (which, when I ping, gives: 64 bytes from 107.107.90.17: icmp_seq=0 ttl=50 time=75.057 ms). The PTR records on Digital Ocean are as follows:

    IP Address       PTR Record
    107.107.38.99    domain.info.
    107.107.90.17    domain.com.

    and the DNS configurations on Digital Ocean are:

    domain.com:  A: @ 107.107.90.17   CNAME: * @
    domain.info: A: @ 107.107.38.99   CNAME: * @

    I am not sure what the issue is. If you have any clue, please let me know; I would be really grateful. If you need any other info, let me know.

    Read the article

  • SharePoint 2010 Service Pack 1

    - by Ricardo Peres
    Install the updates in this order:
    1. SharePoint Foundation 2010 SP1
    2. SharePoint Foundation 2010 Language Packs SP1
    3. SharePoint Server 2010 SP1
    4. SharePoint Server 2010 Language Packs SP1
    5. June 2011 Cumulative Update
    After installing any of these packages, run the SharePoint 2010 Products Configuration Wizard. Also, you may want to read these:
    - Service Pack 1 (SP1) for Microsoft SharePoint Foundation 2010 and Microsoft SharePoint Server 2010 (white paper)
    - Description of SharePoint Server 2010 SP1

    Read the article
