Search Results

Search found 17971 results on 719 pages for 'website'.


  • Software development company business plan

    - by Navi
    I apologize in advance if this is the wrong forum for this question, so please forward me to the right place. I have about 10 years of professional experience as a software developer, mostly on the Java platform writing server-side programs, and I have picked up a bit of Linux skills along the way. I know HTML and JavaScript, so I can make a website that would not be too ugly, but I am not going to win any prizes with it; in fact I think I am pretty terrible in the user interface department.

    My initial plan is to do Android development. I read a few Android books and tried making a few apps, and since Android is Java based I think I have the technical side down. Lately I have been thinking about iPhone and Mac development because of the relevant app store/development programs. The trouble is I don't know Objective-C. As a side question, how long would it take me to become proficient in Objective-C?

    Considering that I am working on my own and could hire somebody to help me for a short time for low wages if necessary, what are my options? What are the pros and cons of the development programs and app stores of Android and Apple? Which development programs/app stores are out there besides the ones I mentioned? Do you think it is necessary to find funds to get me started, or should I just use my savings? If you have positive or negative experiences in similar situations, can you please share them? Thanks for your help.

    Read the article

  • Why is my iPhone WIFI so slow at home?

    - by John Fouhy
    When my iPhone is connected to my home wireless network, the internet is unusably slow. I installed the speedtest.net application; here are some results from tonight:

      Down: 0.0kB/s, Up: 0.0kB/s, ping: 2230ms
      Down: 2.5kB/s, Up: 40.5kB/s, ping: 2182ms
      Down: 0.0kB/s, Up: 20.0kB/s, ping: 197ms

    For comparison, here is the result from my iMac to the same server, which is on the same wireless network (and has no wired connection): Down: 139kB/s, Up: 53.8kB/s, ping: 182ms. Neither my iMac nor the Dell laptop also on the network has experienced the wifi problems I get with my iPhone. On the other hand, I tried browsing a website on the wireless network at work with no problems. EDIT: SpeedTest at work gives me 156kB/s down. EDIT 2: My girlfriend (owner of the Dell) reports that the internet is actually sometimes very slow, so perhaps there is more going on; no problems with my iMac. My router is an ASUS WL-500g Premium V2 running OpenWrt Kamikaze with X-Wrt Extensions 8.09.

    Read the article

  • The advantages & disadvantages to be had from using a Web Framework?

    - by JHarley1
    Hello. This question is focused on extracting the advantages and disadvantages of using web-based frameworks such as CakePHP, Zend, jQuery and ASP.NET. The question is completely language agnostic. Let me start with the notion of "standing on the shoulders of giants".

    Advantages:
    - Empowers developers - by compressing features that would previously have taken hundreds of lines of code into one simple function call, frameworks let developers integrate more complex features into their websites.
    - Allows for quicker development of applications - very relevant for people that need websites created in a very small window (has anyone any examples of this?).
    - Lower costs - allows programmers to pass cost savings on to the customer; a whole new range of customers is generated that wanted a website but previously could not afford the higher development costs.

    Disadvantages:
    - Lost understanding - by relying on the features of a framework, a developer is in danger of losing understanding of how things work (under the hood).
    - The configuration cliff - once you go beyond the configuration options of your framework, your productivity drops right off; it can be difficult to implement features outside of a framework's configuration.
    - Developer tramlines - you (the developer) have to do things the way the framework's developers want you to do them.
    - Security issues - giving people tools to develop professional-looking websites quickly is a potential risk; people can quickly create professional-looking websites for fraudulent companies.

    I wonder what people make of my points, and whether anybody disagrees with them? Also, if people have additional points I would be grateful. Many thanks, J

    Read the article

  • Accessing server by dedicated IP address

    - by Sherwin Flight
    I'm having an issue with my hosting provider after migrating to a new account. It's taking some time to get the problem sorted out, so I am hoping someone here can shed some light on the situation. The server is running WHM/cPanel, and the site I am trying to access has a dedicated IP address. When I connect to the server like this - http://IP.HERE - instead of showing me the website the way I would expect, it is showing the contents of a subfolder. So, while I would expect it to load public_html/, it is loading public_html/somefolder/ instead. Any idea why this is happening instead of showing the site's homepage the way I would expect? EDIT: It is not redirecting, so the URL is just http://IP.ADDRESS/, but the files listed are from a subfolder. So, it looks as though I went to http://IP.ADDRESS/subfolder when the URL says it should be showing the main folder contents. When I access the site using the domain name, it works properly, so I assume the document root is set correctly.
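    A minimal sketch of the Apache virtual-host layout worth checking here (hostnames, IP and paths are hypothetical): when a request arrives by bare IP address and matches no ServerName or ServerAlias, Apache serves the first virtual host defined for that address, so a stray vhost whose DocumentRoot points at the subfolder would produce exactly this symptom. In WHM this corresponds to the "default website" assigned to the dedicated IP.

      # First-defined vhost for the IP acts as the default for requests with no
      # matching Host header (e.g. a bare http://IP.ADDRESS/ request).
      <VirtualHost 192.0.2.10:80>
          ServerName default.example.com
          DocumentRoot /home/user/public_html          # should NOT point at the subfolder
      </VirtualHost>

      <VirtualHost 192.0.2.10:80>
          ServerName www.example.com
          DocumentRoot /home/user/public_html
      </VirtualHost>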

    Read the article

  • Deleting files using .NET that were migrated from win2k3

    - by Andrew Duncan
    We recently migrated an ASP.NET website from Windows 2003 to Windows 2008 R2 by zipping up all the files and extracting them to the new site. Since migrating, the web application is still able to upload and delete files (that are new); however, it's unable to delete files that were copied from the original Win 2k3 app. We're guessing it's a permissions problem because the error is: Access to the path 'E:.......PATH.....' is denied. We've been trying to match the permissions of a newly uploaded file to those of the migrated files. Newly uploaded files seem to get the APP POOL user in their permissions and as the OWNER; however, the original files didn't have this. Any help would be fantastic. Thanks,
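    A minimal sketch (assuming .NET Framework on Windows; the path and app-pool name are hypothetical) of granting the application-pool identity full control over the migrated files so the web application can delete them; it would be run once from a console with administrative rights:

      using System.IO;
      using System.Security.AccessControl;

      class FixMigratedPermissions
      {
          static void Main()
          {
              // Hypothetical site path and app-pool identity - adjust to the real values.
              string root = @"E:\Sites\MySite\uploads";
              string appPoolIdentity = @"IIS AppPool\MyAppPool";

              foreach (string file in Directory.EnumerateFiles(root, "*", SearchOption.AllDirectories))
              {
                  // Add a full-control ACE for the app pool so the web app can delete the file later.
                  FileSecurity acl = File.GetAccessControl(file);
                  acl.AddAccessRule(new FileSystemAccessRule(
                      appPoolIdentity, FileSystemRights.FullControl, AccessControlType.Allow));
                  File.SetAccessControl(file, acl);
              }
          }
      }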

    Read the article

  • Recommended resource to understand Internet conventions: IPs, CNAMEs, *, MX, etc.

    - by Petras
    I am a programmer who has been creating websites for many years in shared hosting environments. To make a website live, I logged into wherever the domain was hosted and updated the name servers. Sometimes I didn't want POP email, so I changed an A record. I never really understood what this meant, but it worked. Now we have a dedicated server and I have to fill out a whole set of DNS records to make it live, plus I was told I had to complete zones at my domain host. I would really like to know what all this means. What is a * record? What is an @ record? How does the internet work regarding all these conventions? Is there a good, approachable book on this topic?
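    As a hedged illustration (hypothetical names and addresses), a typical BIND-style zone file shows how the pieces in the question fit together: @ stands for the zone's own domain, * is a wildcard that matches any subdomain not defined elsewhere, CNAME aliases one name to another, and MX records name the mail servers.

      ; Zone file sketch for example.com (all values hypothetical)
      @       IN  SOA   ns1.example.com. admin.example.com. ( 2024010101 3600 900 604800 300 )
      @       IN  NS    ns1.example.com.
      @       IN  NS    ns2.example.com.
      @       IN  A     192.0.2.10            ; example.com itself
      www     IN  CNAME example.com.          ; www.example.com aliases the A record above
      @       IN  MX 10 mail.example.com.     ; mail for example.com is delivered to this host
      mail    IN  A     192.0.2.20
      *       IN  A     192.0.2.10            ; wildcard: any other subdomain resolves here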

    Read the article

  • Static pages for large photo album

    - by Phil P
    I'm looking for advice on software for managing a largish photo album for a website: 2000+ pictures, one-time drop (probably). I normally use MarginalHack's album, which does what I want: pre-generate thumbnails and HTML for the pictures, so I can serve without needing a dynamic run-time and there's less attack surface to worry about. However, it doesn't handle pagination or the like, so it's unwieldy for this case. This is a one-time drop for pictures from a wedding, with a shared usercode/password for distribution to the guests; I don't wish to put the pictures in a third-party hosting environment. I don't wish to use PHP, simply because that's another run-time to worry about; I might relent and use something dynamic if it's Python or Perl based (as I can maintain things written in those). I currently have Apache serving static files, Album-generated pages, and some sub-directories to divide up the content to be a little more manageable. Something like Album but with pagination already handled would be great, but I'm willing to have something a little more dynamic if it lets people comment or caption and store the extra data in something like an SQLite DB. I'd want something lightweight, not a full-blown CMS with security updates every three months. I don't want to upload pictures of other people's children into a third-party free service where I don't know what the revenue model is. (For my site: revenue is none, costs out of pocket.) Existing server hosting is *nix, Apache, some WSGI. Client-side I have MacOS. Any advice?

    Read the article

  • Disabling default gestures in Scrybe

    - by RoboShop
    I just upgraded my touchpad drivers and they came with a Scrybe program which allows you to basically do a gesture and automatically load up a website or an application, etc. Sounds like it could potentially be very useful. The only thing I don't really like about it is that it comes preloaded with all of these default gestures that take you to Facebook, Amazon, eBay etc. I'm sure they purposely put them there because they get money for referrals, but is there a way to turn them off? I would just like my own links in there. I also think that it takes about 2-3 seconds for a gesture to be recognized, and that might be due largely to the fact that it's got to compare it against all these default gestures. If I could somehow disable them, I'm sure it would work faster. Alternatively, I'd be happy with a recommendation of a program similar to Scrybe that works faster.

    Read the article

  • Automating the Backup of a SQL Server 2008 Express Database

    - by JaydPage
    Steps involved:
    1) Create a database backup script.
    2) Create a scheduled task to run the backup script.

    1. Create a database backup script
    a) Download and install SQL Server Management Studio. This is a free tool available on the Microsoft website.
    b) Once Management Studio is installed, launch it and connect to the SQL Server instance that contains the database that you want to back up.
    c) Right click on the database and then in the menu choose Tasks -> Back Up...
    d) This will open up a window where you can choose your backup options; once you are happy with the options, click on the "Script" button near the top and select the "Script Action to File" option.
    e) Save the file.

    2. Create a scheduled task to run the backup script
    a) Open up Windows Task Scheduler.
    b) Create a new task using the wizard; when asked to select a program, browse to C:\Program Files\Microsoft SQL Server\100\Tools\binn\SQLCMD.exe
    c) There are 2 arguments that need to be set: -S SERVER_INSTANCE_NAME -i "PATH_OF_SQLBACKUP_SCRIPT", where SERVER_INSTANCE_NAME is the name of the instance of SQL Server that contains your database, e.g. (local), and PATH_OF_SQLBACKUP_SCRIPT is the path of your backup script, e.g. "C:\Program Files\Microsoft SQL Server\DatastoreBackup.sql"
    d) Adjust the task to run at the desired times and you are done.
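    For reference, the script saved in step 1 typically boils down to a single BACKUP DATABASE statement. A minimal hand-written equivalent (database name and paths are hypothetical) looks like this, and it can be tested from a command prompt with the same SQLCMD arguments the scheduled task will use:

      -- DatastoreBackup.sql : full backup of a hypothetical database
      BACKUP DATABASE [Datastore]
      TO DISK = N'C:\Backups\Datastore.bak'
      WITH INIT,                  -- overwrite the previous backup file
           NAME = N'Datastore Full Backup',
           STATS = 10;            -- print progress every 10 percent

      -- Manual test run (mirrors the scheduled task's arguments):
      --   SQLCMD.exe -S .\SQLEXPRESS -i "C:\Program Files\Microsoft SQL Server\DatastoreBackup.sql"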

    Read the article

  • Why are my USB 2.0 devices hanging Windows XP?

    - by BenAlabaster
    Background on the machine I'm having a problem with: the machine was inherited and appears to be circa 2003 (there's a date stamp on the power supply which leads me to this conclusion). I've got it set up as a Skype terminal for my 2 year old to keep in touch with her grandparents and other members of the family - which everyone loves. It has a generic ATX motherboard with no identifying markings other than one stamp that says "Rev.B". CPU-Z identifies the motherboard model as VT8601 but doesn't provide me with any manufacturer name. On board it has 1 x 10/100 LAN, 2 x USB 1.0, VGA, PS/2 for KB and mouse, parallel port, 2 x serial ports, 2 x IDE, 1 x floppy, 2 x SDRAM slots, 1 x CPU housing that is seating a 1.3GHz Intel Celeron CPU, 3 x PCI, 1 x AGP - although you can only use 2 of the PCI slots if you use the AGP slot due to the physical layout of the board. It's got 768Mb PC133 SDRAM - 1 x 512Mb & 1 x 256Mb installed - as well as a D-LINK WDA-2320 54G Wi-Fi network card and a generic USB 2.0 expansion board containing 3 x external + 1 x internal USB connectors - it has a NEC uPD720102 chipset. It has a DVD+/-RW running as master on IDE1 and a 1.44Mb 3.5" floppy drive connected to the floppy connector. It has an 80Gb Western Digital hard drive running as master on IDE0. All this is sitting in a slimline case. I don't know the wattage of the PSU, but can post this later if this proves to be helpful. The motherboard is running a version of Award BIOS for which I don't have the version number to hand, but can again post this later if it would be helpful. The hard disk is freshly formatted and built with Windows XP Professional/Service Pack 3 and is up to date with all current patches. In addition to Windows XP, the only other software it's running is Skype 4.1 (4.2 hangs the whole machine as soon as it starts up, requiring a hard boot to recover). It's got a Daytek MV150 15" touch screen hooked up to the on board VGA and COM1 sockets with the most current drivers from the Daytek website and the most current version of ELO-Touchsystems drivers for the touch component. The webcam is a Logitech Webcam C200 with the latest drivers from the Logitech website.

    The problem: if I hook any devices to the USB 2.0 sockets, it hangs the whole machine and I have to hard boot it to get it back up. If I have any devices attached to the USB 2.0 sockets when I boot up, it hangs before Windows gets to the login prompt and I have to hard boot it to recover.

    Workarounds found: I can plug the same devices into the on board USB 1.0 sockets and everything works fine, albeit at reduced performance. I've tried 3 different kinds of USB thumb drives, 3 different makes/models of webcams and my iPhone, all with the same effect. They're recognized and don't hang the machine when I hook them to the USB 1.0 ports, but if I hook them to the USB 2.0 ports, the machine hangs within a couple of seconds of recognizing the devices were connected.

    Attempted solutions: I've seen suggestions that this could be a power problem - that the PSU just doesn't have the wattage to drive these ports. While I'm doubtful this is the problem (after all, the motherboard has the same standard connector regardless of the PSU wattage), I tried disabling all the on board devices that I'm not using - on board LAN, the second COM port, the AGP connector etc. - through the BIOS in what I'm sure is a futile attempt to reduce the power consumption. I also modified the ACPI and power management settings. It didn't have any noticeable effect, although it didn't do any harm either. Could the wattage of the PSU really cause this problem? If it can, is there anything I need to be aware of when replacing it, or do I just need to make sure it's got a higher wattage than the current one? My interpretation was that the wattage only affected the number of drives you could hook up to the power connectors - is that right? I've installed the USB card in another machine and it works without issue, so it's not a problem with the USB card itself, and Windows says the card is installed and working correctly... right up until I connect a device to it. The only thing I haven't done, which I only just thought of while writing this essay, is trying the USB 2.0 card in a different PCI slot, or re-ordering the wi-fi and USB cards in the slots - although I'm not sure if this will make any difference. Does anyone have any experience that would suggest this might work?

    Other thoughts/questions: perhaps this is an incompatibility between the USB 2.0 card and the BIOS - would re-flashing the BIOS with a newer version help? Do I need to be able to identify the manufacturer of the motherboard in order to find a BIOS edition specific to this motherboard, or will any version of Award BIOS function in its place?

    Question: does anyone have any ideas that could help me get my USB 2.0 devices hooked up to this machine?

    Edit: updated the USB 2.0 info with reference to the actual card - http://www.xpcgear.com/lpnec4u.html

    Read the article

  • Compaq Q2009 monitor display Issues

    - by jasonco
    I have this Compaq Q2009 20-inch monitor and I am having display issues. When I turn on the computer the image is displayed fine for a fraction of a second and then the display goes black. If I turn off the monitor and turn it on again, the same thing happens: the image is displayed for a fraction of a second and then goes black. This is odd. I've looked all over the web and I haven't found anything useful (not even on the HP website). There is no "out of range" or "recommended resolution" message; it just displays and then goes black. I think the resolution is fine because it was working OK until recently and I haven't changed anything in that regard. Any help would be appreciated. Thanks.

    Read the article

  • How can I audit a Linux filesystem for files which have been changed or added within a specific time period

    - by Bcos
    We are a website design/hosting company running several sites on a Linux server using Joomla 1.5.14, and recently someone was able to exploit a vulnerability in the RW Cards component to write arbitrary files and modify existing files on our filesystem, enabling them to do some nasty things to our customers' sites. We have removed the vulnerable modules from all sites but are still seeing some problems. We suspect that they still have some scripts installed and need a way to audit anything that has been changed or added in the last 10 days. Is there a command or script we can run to do this?
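    A minimal sketch of that kind of audit (the web-root paths and the day count are examples to adjust): GNU find can list regular files modified or created within the last 10 days, and a quick grep over the recent PHP files helps spot dropped backdoors.

      # Files whose content changed in the last 10 days under the web roots
      find /var/www /home/*/public_html -type f -mtime -10 -printf '%TY-%Tm-%Td %TT %p\n' | sort

      # Files whose inode (ownership/permissions/creation) changed in the last 10 days
      find /var/www /home/*/public_html -type f -ctime -10 -ls

      # Common backdoor hints among recently changed PHP files
      find /var/www -type f -mtime -10 -name '*.php' -exec grep -l 'base64_decode\|eval(' {} +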

    Read the article

  • Help with migrating from Windows, x64 FGLRX, CPU load, Java and Minecraft

    - by joxer
    I'm new to Ubuntu; this is the second time I have installed it. This computer is a Dell Studio 1558. Some specs: CPU - Intel Core i7 Q720 1.6GHz, GPU - ATI Mobility Radeon HD 5400. FGLRX - I've followed these instructions, along with inspecting many others, and I have tried all of the variants mentioned in that thread before reverting back to the drivers supplied with Ubuntu (through Additional Drivers), which apparently seem to work best. I am testing them with Minecraft, as silly as it may sound. Within 2 to 60 minutes the FPS drops from 70+ to somewhere between 0 and 5, while "fgl_glxgears" runs at between 400 and 800 FPS smoothly. I am using the Oracle (Sun) JRE 6 to run Minecraft, which I got through a tutorial linked on Oracle's website; I currently have no other version of Java installed (it was worse when I had a few others here). After closing the game, Ubuntu is similarly slow. I've checked the CPU load using System Monitor and it shows one of the CPUs jumping to 80%~100% load at a time; a reboot solves it. I realize my mess is up to me to solve, but a hand is always appreciated. Thank you very much in advance.

    Read the article

  • How do I convert a .vhd disk image to work with VMWare Fusion 2?

    - by Paul D. Waite
    I've just installed VMWare Fusion 2 on my Mac. Microsoft makes available some Virtual PC disk images containing different versions of IE, so that us humble web developers can test our code on them: http://www.microsoft.com/downloads/details.aspx?FamilyId=21EABB90-958F-4B64-B5F1-73D0A413C8EF&displaylang=en I want to convert these .vhd files to work with VMWare Fusion 2. Note: VMWare Fusion 3 can import .vhd files natively (File > Import), and this works just fine on the Microsoft IE compatibility VMs. I've tried VMWare Converter Standalone on Windows, but it doesn't work with .vhd files (as of the current version, 4.0.1). Any ideas? VMWare's website is confused corporate hell.
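    One hedged workaround (assuming qemu-img is available, e.g. from MacPorts or any Linux machine; file names are placeholders) is to convert the VHD into a VMDK that Fusion 2 can attach to a new virtual machine:

      # "vpc" is qemu's name for the VirtualPC/VHD format
      qemu-img convert -f vpc -O vmdk "IE7-XPSP3.vhd" "IE7-XPSP3.vmdk"

      # In Fusion 2, create a new VM and choose to use an existing virtual disk,
      # pointing it at the converted .vmdk instead of creating a fresh one.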

    Read the article

  • Catch-all DNS record

    - by Christian Sciberras
    Intro: Our users have the ability to buy a domain (e.g. user1.com) and make it point to our website (e.g. example.com) by simply pointing user1.com to ns1/ns2.example.com.

    Issue: So far everything's good; however, example.com does not like this - we need to set up WHM/cPanel to make the server accept user1.com. Problem is, we'd rather make this automatic, possibly without having to use the WHM API.

    The question: We need some sort of "catch-all" wildcard entry so that we capture all of our users' possible domains.
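    On the web-server side, a hedged sketch of what the catch-all usually reduces to (the IP and paths are hypothetical, and cPanel normally manages these vhost includes itself): Apache routes any Host header that matches no other ServerName/ServerAlias to the wildcard vhost, so every customer domain pointed at the IP lands on the site without per-domain setup.

      # Catch-all vhost: any domain pointed at this IP that matches no other
      # ServerName or ServerAlias lands here.
      <VirtualHost 203.0.113.5:80>
          ServerName catchall.example.com
          ServerAlias *
          DocumentRoot /home/example/public_html
      </VirtualHost>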

    Read the article

  • C# Threading Background Process - Programming - How to?

    - by Magic
    Hello... I have been given the horrible task of doing this:

    - Launch the website
    - Take a screenshot
    - Fill in the form details, click on Next
    - Take a screenshot
    - ...
    - Rinse. Repeat.

    Now, with various combinations, this comes to 300 screenshots, and I have to do this for 4 different browsers: Chrome, Firefox, IE 6 and IE 7. I cannot use tools which will capture the screenshot and store them, such as SnagIt. I need to take a screenshot, copy it to a Word document, then take the second screenshot and copy that to the Word document as well. I thought I would write a tiny utility to help me do this. Here is the requirement spec that I put up for it:

    - An executable which, once launched, seats itself in the system tray.
    - While it is active, on every press of the Print Screen key it should write the captured screen to a Word document as defined (either a default path or a user-defined one).
    - Save the document periodically.

    Now, my question is: if I am going to develop this using C# (WinForms application), how do I go about doing it? I can do a fair bit of C# programming and I am willing to learn, but I am not able to locate references for how to run a background process, and while it runs it has to capture the Print Screen command. Can you folks point me to the right material where I can learn this? Theoretical references should suffice, but if there are practical references, then nothing like it. Thanks!
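    A minimal WinForms sketch of the hotkey-capture half (the Word side is omitted; the class name and file naming are hypothetical, and it assumes Print Screen can be claimed as a global hotkey on the machine): a form registers the key with RegisterHotKey, grabs the screen when it fires, and saves the bitmap. Appending the images to a Word document would then go through Office interop or OpenXML.

      using System;
      using System.Drawing;
      using System.Drawing.Imaging;
      using System.Runtime.InteropServices;
      using System.Windows.Forms;

      // Hidden form that listens for the global Print Screen hotkey.
      class ScreenGrabForm : Form
      {
          [DllImport("user32.dll")]
          static extern bool RegisterHotKey(IntPtr hWnd, int id, uint fsModifiers, uint vk);
          [DllImport("user32.dll")]
          static extern bool UnregisterHotKey(IntPtr hWnd, int id);

          const int WM_HOTKEY = 0x0312;
          const int HOTKEY_ID = 1;
          int shotCount;

          public ScreenGrabForm()
          {
              ShowInTaskbar = false;
              WindowState = FormWindowState.Minimized;   // stand-in for a proper tray icon
              RegisterHotKey(Handle, HOTKEY_ID, 0, (uint)Keys.PrintScreen);
          }

          protected override void WndProc(ref Message m)
          {
              if (m.Msg == WM_HOTKEY && (int)m.WParam == HOTKEY_ID)
              {
                  // Capture the primary screen; a real tool would append the image
                  // to a Word document here instead of saving a PNG.
                  Rectangle bounds = Screen.PrimaryScreen.Bounds;
                  using (Bitmap bmp = new Bitmap(bounds.Width, bounds.Height))
                  using (Graphics g = Graphics.FromImage(bmp))
                  {
                      g.CopyFromScreen(bounds.Location, Point.Empty, bounds.Size);
                      bmp.Save(string.Format("shot_{0:D3}.png", ++shotCount), ImageFormat.Png);
                  }
              }
              base.WndProc(ref m);
          }

          protected override void OnFormClosed(FormClosedEventArgs e)
          {
              UnregisterHotKey(Handle, HOTKEY_ID);
              base.OnFormClosed(e);
          }

          [STAThread]
          static void Main()
          {
              Application.Run(new ScreenGrabForm());
          }
      }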

    Read the article

  • Doing a mysql dump causes swapping issues

    - by DFischer
    I do a mysqldump manually every night. I just noticed that after it is done and I try to access the website, it is very slow. After I take a look at free -mh I notice that the server is now swapping, when it otherwise wasn't before the mysqldump. What am I to do in this case? Just restart the server every time I back up? That doesn't seem very effective. The raw database dump file is 1.1 GB.
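    A hedged mitigation sketch (database name and paths are placeholders): running the dump at low CPU and I/O priority and streaming it straight into a compressed file reduces the memory and cache pressure that pushes the web stack into swap, and --single-transaction avoids long locks on InnoDB tables.

      # Low-priority, single-transaction dump piped straight to gzip
      nice -n 19 ionice -c3 \
        mysqldump --single-transaction --quick dbname | gzip > /backup/dbname-$(date +%F).sql.gz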

    Read the article

  • Centrally managing 100+ websites without bankrupting a small company

    - by palintropos
    I'm mainly interested in opinions on the trade-offs between having a single central server all the websites connect to, as opposed to each website mirroring a subset of the master database with all the products in it. For example, will I run into severe performance issues (or even security issues, or restrictions) making queries to an offsite database? Will we hit scalability issues we can't handle early on from the sheer bandwidth required to maintain this? If we do go with something like a script that keeps smaller databases (each containing a subset of the central master data) in sync, what sorts of issues will we likely encounter there? I would really like the opinions of people far more knowledgeable than I am regarding the pros and cons of both setups and what headaches we are likely to encounter.

    CLARIFICATION: This should not be viewed as a question about whether we should implement one database vs multiple databases. That question has been answered numerous times. The question is regarding the pros and cons of a deployment like this having the ability to manage all the websites centrally (one server) vs trying to keep them all in sync if they each have their own db (multiple servers).

    REAL-WORLD EXAMPLE: We are a t-shirt company, and we have individual websites for our different kinds of t-shirts, but we're looking at central order management integrated with our single shopping cart (which is ColdFusion + MySQL). Now, let's say we have a t-shirt that's on 10 of our websites and we change an image for it. Ideally we would change that in one place and the change would propagate, but how would we set this up?

    Read the article

  • Installing GPSBabel on CentOS 5 x86_64

    - by Clint Chaney
    Well, first let me say I have no clue about doing anything on my server; I ask my host to do all installs for me. I run a website where users store latitude and longitude coordinates in my database, and I would like them to be able to download these waypoints to their GPS units. I found a program called GPSBabel that allows this to be done: http://www.gpsbabel.org/ I want to be able to control GPSBabel from PHP using exec() or something along those lines. The problem is that the Linux version of the program is distributed as source, and my host doesn't want to build or install it without some instructions. Does anyone have experience with installing this? Or perhaps know someone that has and can point me in the right direction? Any help would be hugely appreciated. I'm pretty much stuck without getting this to work.
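    As a hedged sketch (package names, version and file formats are examples), the usual from-source build on CentOS followed by the kind of command line PHP's exec() would wrap; GPSBabel's CLI takes an input format/file pair and an output format/file pair:

      # Build and install from source (run as root); CLI-only build, no GUI needed
      yum install gcc-c++ make expat-devel libusb-devel
      tar xzf gpsbabel-1.4.2.tar.gz
      cd gpsbabel-1.4.2
      ./configure && make && make install

      # Example conversion PHP could drive via exec(): universal CSV of waypoints -> GPX
      gpsbabel -i unicsv -f /tmp/waypoints.csv -o gpx -F /tmp/waypoints.gpx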

    Read the article

  • centos 100% disk full - How to remove log files, history, etc?

    - by kopeklan
    mysqld won't start because disk space is full:

      101221 14:06:50 [ERROR] /usr/libexec/mysqld: Error writing file '/var/run/mysqld/mysqld.pid' (Errcode: 28)
      101221 14:06:50 [ERROR] Can't start server: can't create PID file: No space left on device

    Running df -h:

      Filesystem            Size  Used Avail Use% Mounted on
      /dev/sda2              16G  3.2G   12G  23% /
      /dev/sda5             4.8G  4.6G     0 100% /var
      /dev/sda3             430G  855M  407G   1% /home
      /dev/sda1              76M   24M   49M  33% /boot
      tmpfs                 956M     0  956M   0% /dev/shm

    du -sh * in /var:

      12K   account
      56M   cache
      24K   db
      32K   empty
      8.0K  games
      1.5G  lib
      8.0K  local
      32K   lock
      221M  log
      16K   lost+found
      0     mail
      24K   named
      8.0K  nis
      8.0K  opt
      8.0K  preserve
      8.0K  racoon
      292K  run
      70M   spool
      8.0K  tmp
      76K   webmin
      2.6G  www
      20K   yp

    /dev/sda5 holds the website files in /var/www. Because this is the first time this has happened, I have no idea which files to remove other than moving /var/www to another partition. And one more thing: what is the right way to remove log files, history, etc. on /dev/sda5?
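    A hedged cleanup sketch (commands are examples; check what each log is before touching it): the du output points at /var/www (2.6G) and /var/log (221M) as the big consumers on /var, so the quick wins are moving the web root onto the nearly empty /home partition and compressing or truncating logs rather than deleting them blindly.

      # Find the biggest logs, then force a logrotate pass to rotate and compress them
      du -s /var/log/* | sort -n | tail
      logrotate -f /etc/logrotate.conf

      # Truncate a huge live log without breaking the daemon that has it open
      > /var/log/messages

      # Move the website onto /home (407G free) and leave a symlink behind
      mv /var/www /home/www && ln -s /home/www /var/www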

    Read the article

  • Tools to manage large network of heterogeneous web applications?

    - by Andrew
    I recently started a new job where I've been tasked with managing a global network of heterogeneous web applications. There's very little documentation. My first order of business is to create an inventory of all of the web applications. Are there any tools out there to manage a large group of web apps? I'd like to collect a large dataset for each website including:
    - logins for web based control panels
    - logins to FTP/ssh accounts
    - Google Analytics tracking code for each site
    - 3rd party libraries used
    - SSL certs, issuers, and expiration dates
    - etc.
    I know I could keep the information in Excel or build a custom database, but I'm hoping there's already a tool out there to help me with this.
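    One slice of that inventory is easy to script before settling on a tool; a small sketch (the host list is hypothetical) that records each site's SSL certificate issuer and expiry date using only the Python standard library:

      import socket
      import ssl

      SITES = ["example.com", "shop.example.org"]   # hypothetical inventory

      context = ssl.create_default_context()
      for host in SITES:
          # Open a TLS connection and pull the peer certificate details.
          with socket.create_connection((host, 443), timeout=10) as sock:
              with context.wrap_socket(sock, server_hostname=host) as tls:
                  cert = tls.getpeercert()
          issuer = dict(pair[0] for pair in cert["issuer"]).get("organizationName", "?")
          print("%s: issued by %s, expires %s" % (host, issuer, cert["notAfter"]))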

    Read the article

  • Centring an HTML element relative to its parent when its width is greater than its parent. [closed]

    - by casr
    I mocked up my intended outcome. So the blue element is the main content of the website and the yellow element represents something like a diagram or an image that has a greater width than the blue element. Ideally, I would like a purely CSS solution that is able to deal with various sizes of images. I have tried various things but have failed so far. I hope you can help! Here's some example markup to set you on your way:

      <!DOCTYPE HTML>
      <html>
      <head>
        <title>Example</title>
        <style>
          #el1 {
            display: block;
            margin: 0 auto;
            width: 30em;
            background-color: #8cabde;
          }
          #el2 {
            /* works when the width is less than the parent */
            display: block;
            margin: 0 auto;
          }
        </style>
      </head>
      <body>
        <article id=el1>
          <p>Some content above.</p>
          <img id=el2 src=http://i.imgur.com/JFfGG.gif title=spaceball width=600 height=400>
          <p>Some content below.</p>
        </article>
      </body>
      </html>
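    A hedged CSS-only approach (the selector follows the markup above; older browsers may need a vendor-prefixed transform): shift the oversized image right by half the parent's width, then pull it back by half of its own width, so its centre lines up with the parent's centre regardless of the image size.

      #el2 {
        display: block;
        margin-left: 50%;              /* left edge starts at the parent's midpoint   */
        transform: translateX(-50%);   /* pull back by half of the image's own width  */
      }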

    Read the article

  • Rewrite for robots.txt and favicon.ico [closed]

    - by BHare
    I have set up some rules so that subdomains (my users) will default to where I have located the robots.txt, favicon.ico, and crossdomain.xml. Therefore, if a user creates a site, say testing.mywebsite.com, and they don't make their own favicon.ico at testing.mywebsite.com/favicon.ico, then it will use the favicon.ico I have in /misc/favicon.ico. This works perfectly, but it doesn't work for the main website: if you attempt to go to mywebsite.com/favicon.ico it will check whether "/" exists, which it does, and then it never redirects to /misc/favicon.ico. How can I get it so both instances redirect to /misc/favicon.ico?

      # Set all crossdomain (openpalace file), favorite icons and robots.txt: if they don't exist on their
      # side, then redirect to the site's, just to have something to go on.
      RewriteCond %{REQUEST_URI} crossdomain.xml$
      RewriteCond ^(.+)crossdomain.xml !-f
      RewriteRule ^(.*)$ /misc/crossdomain.xml [L]

      RewriteCond %{REQUEST_URI} favicon.ico$
      RewriteCond ^(.+)favicon.ico !-f
      RewriteRule ^(.*)$ /misc/favicon.ico [L]

      RewriteCond %{REQUEST_URI} robots.txt$
      RewriteCond ^(.+)robots.txt !-f
      RewriteRule ^(.*)$ /misc/robots.txt [L]
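    A hedged guess at a fix (untested sketch, following the same structure): the second RewriteCond in each block applies -f to a literal pattern string rather than an actual file path, so it never really checks whether the requested file exists on disk; testing %{REQUEST_FILENAME} instead is the usual pattern and should let the fallback fire for the main site as well. The favicon block would become:

      RewriteCond %{REQUEST_URI} favicon\.ico$
      RewriteCond %{REQUEST_FILENAME} !-f
      RewriteRule ^ /misc/favicon.ico [L]

      # The crossdomain.xml and robots.txt blocks follow the same shape.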

    Read the article

  • Need help with ColdFusion and ASP.NET site [closed]

    - by Michael Stone
    To begin, I wasn't too sure how to title this... I've got a few questions. First off, I've got a very big site that's in ColdFusion, and we've been migrating to ASP.NET C# 4.0 for the last 8 months. I've got a team of 7 programmers and no one can seem to figure out these answers, not even our senior C# programmer. We're using Team Foundation Server and we can't figure out how to push up only one small change at a time; right now we're stuck publishing the entire site, and it's causing serious issues. We've currently got the site as a Project and not a Website, and we're wondering if that's one issue - I actually think it might be a problem. We're also dealing with an issue where we can't access our regular folders with relative paths. We're first developing our admin side in .NET, and we've got our regular site and then another site within that for our .NET admin tools. By site, I'm referring to them actually being Sites in IIS. This also creates a problem for us when we're creating tools that upload images and want to store them and access them from our parent Site. I'd very much appreciate any advice on how to go about this in the most standardized way. So what I'm hoping for is advice on:
    - Publishing and managing a site/project in Team Foundation Server. Being able to push up one fix at a time if needed would be GREAT!
    - Any help figuring out the issue of referencing folders from my .NET child site to my parent ColdFusion site using regular relative paths. "/a/images/b/" would be nice instead of only being able to do "/b/images/".
    We're using ColdFusion 8, C# ASP.NET 4.0/Entity Framework/POCO Templates, and a Windows 2008 R2 Server. Thank you in advance for any help.

    Read the article

  • First request too slow even if I have a load balancer in the back

    - by adrian7
    I have Apache 2 on CentOS + BIND with a WordPress website on it (e.g. example.com). I have also set up, on another server in a different country, a load balancer (varnish:80 + nginx 127.0.0.1:8080) for it, whose task is to serve all static content under /wp-content/. Using Simple DNS editor I added an A entry for cdn.example.com pointing to that server's IP, so there is no extra work from a second DNS server. Then, using .htaccess, I redirect all requests for jpg|gif|css|js files to cdn.example.com. That works, and all files are saved on the "cdn" server and served right away. My problem is that the first time I visit example.com (e.g. after restarting the computer or closing the browser), the load time is 1 up to 3 seconds, while any subsequent page loads take only 300 to 600 milliseconds. I know it might be a DNS issue, but I have done a cache check on several websites and cdn.example.com indicates the right IP. Do you have any ideas where I should dig to solve this first-time slowness?
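    A hedged first check (hostnames are the examples from the question): a one-time delay like this is often an uncached DNS lookup or a cold varnish/nginx cache, and two quick measurements help separate the suspects - dig shows the record's remaining TTL, while curl breaks a full first visit into name-lookup, connect and total time.

      # TTL is the second column; a repeat query answered almost instantly is coming from cache
      dig +noall +answer example.com
      dig +noall +answer cdn.example.com

      # Split a cold request into DNS, connect and total time
      curl -o /dev/null -s -w 'dns: %{time_namelookup}s  connect: %{time_connect}s  total: %{time_total}s\n' http://example.com/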

    Read the article
