Search Results

Search found 47903 results on 1917 pages for 'test driven development'.


  • Is there a way to undo Mocha stubbing of any_instance in Test::Unit

    - by Craig Walker
    Much like this question, I too am using Ryan Bates's nifty_scaffold. It has the desirable aspect of using Mocha's any_instance method to force an "invalid" state in model objects buried behind the controller. Unlike the question I linked to, I'm not using RSpec, but Test::Unit. That means that the two RSpec-centric solutions there won't work for me. Is there a general (ie: works with Test::Unit) way to remove the any_instance stubbing? I believe that it's causing a bug in my tests, and I'd like to verify that.

    Read the article

  • at what point in the test cycle are the test results written to a file?

    - by jcollum
    I'd like to take the entirety of the test results and publish it to a database. Got the database, got the table, got the script to publish it. Question is, at what point in the MSTest cycle would the results be fully written to the file? And how can I get the path to that file? I'd especially like to grab the "TextMessages" node and put it into my database. I assumed AssemblyCleanup, but the TestContext doesn't seem to be available then.
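
    If the results file does turn out to be written only after the whole run finishes, one workaround is to publish from a separate script that runs after MSTest exits, rather than from AssemblyCleanup. Below is a hedged sketch in Python; the file path, the 'TextMessages' tag match, and the SQLite stand-in database are all assumptions to adapt to the real .trx layout and target database.

        # Hedged sketch: parse an MSTest .trx results file after the run and push
        # text messages into a database. Paths, element names and the DB schema
        # are placeholders; adjust them to the actual .trx structure you see.
        import sqlite3
        import xml.etree.ElementTree as ET

        TRX_PATH = "TestResults/latest.trx"      # assumed path to the results file
        DB_PATH = "results.db"                   # assumed target database

        def extract_text_messages(trx_path):
            """Collect the text of every element whose tag ends with 'TextMessages'."""
            tree = ET.parse(trx_path)
            messages = []
            for elem in tree.getroot().iter():
                # .trx files are namespaced, so match on the tag suffix only
                if elem.tag.endswith("TextMessages"):
                    text = "".join(elem.itertext()).strip()
                    if text:
                        messages.append(text)
            return messages

        def publish(messages, db_path):
            conn = sqlite3.connect(db_path)
            conn.execute("CREATE TABLE IF NOT EXISTS test_messages (message TEXT)")
            conn.executemany("INSERT INTO test_messages (message) VALUES (?)",
                             [(m,) for m in messages])
            conn.commit()
            conn.close()

        if __name__ == "__main__":
            publish(extract_text_messages(TRX_PATH), DB_PATH)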

    Read the article

  • Resources for setting up a Visual Studio/C++ development environment

    - by Tom H.
    I haven't done much "front-end" development in about 15 years since moving to database development. I'm planning to start work on a personal project using C++ and since I already have MSDN I'll probably end up doing it in Visual Studio 2010. I'm thinking about using Subversion as a version control system eventually. Of course, I'd like to get up and running as quickly as I can, but I'd also like to avoid any pitfalls from a poorly organized project environment. So, my question is, are there any good resources with common best practices for setting up a development environment? I'm thinking along the lines of where to break down a solution into multiple projects if necessary, how to set up a unit testing process, organizing resources, directories, etc. Are there any great add-ons that I should make sure I have set up from the start? Most tutorials just have one simple project, type in your code and click on build to see that your new application says, "Hello World!". This will be a Windows application with several DLLs as well (no web development), so there doesn't need to be a deploy to a web server kind of process. Mostly I just want to make sure that I don't miss anything big and then have to extensively refactor because of it. Thanks!

    Read the article

  • "rake test" doesn't load fixtures?

    - by Pavel K.
    when i run rake test --trace here's what happens:

        ** Invoke test (first_time)
        ** Execute test
        ** Invoke test:units (first_time)
        ** Invoke db:test:prepare (first_time)
        ** Invoke db:abort_if_pending_migrations (first_time)
        ** Invoke environment (first_time)
        ** Execute environment
        ** Execute db:abort_if_pending_migrations
        ** Execute db:test:prepare
        ** Invoke db:test:load (first_time)
        ** Invoke db:test:purge (first_time)
        ** Invoke environment
        ** Execute db:test:purge
        ** Execute db:test:load
        ** Invoke db:schema:load (first_time)
        ** Invoke environment
        ** Execute db:schema:load
        ** Execute test:units
        /usr/bin/ruby1.8 -I"lib:test"....

    (and after that fails because there's no fixtures loaded) why doesn't it load fixtures (i thought that would be default behaviour) and how do i make it load fixtures before executing tests??? p.s. my test/test_helper.rb content is:

        ENV["RAILS_ENV"] = "test"
        require File.expand_path(File.dirname(__FILE__) + "/../config/environment")
        require 'test_help'

        class ActiveSupport::TestCase
          self.use_transactional_fixtures = true
          self.use_instantiated_fixtures = false
          fixtures :all
        end

    (rails 2.3.4)

    Read the article

  • How can I use that?

    - by user289220
    test test testtest test testtest test testtest test testtest test testtest test testtest test testtest test testtest test testtest test testtest test testtest test testtest test testtest test testtest test testtest test testtest test testtest test test

    Read the article

  • reference list for non-IT driven algorithmic patterns

    - by Quicker
    I am looking for a reference list of non-IT-driven algorithmic patterns (which can still be helped by IT implementations). An example list would be (name; short desc; reference):

        Travelling Salesman; find the shortest possible route on a multiple target path; http://en.wikipedia.org/wiki/Travelling_salesman_problem
        Ressource Disposition (aka Regulation); distribute a limited/exceeding input on a given number of output receivers based on distribution rules; http://database-programmer.blogspot.de/2010/12/critical-analysis-of-algorithm-sproc.html

    If there is no such list, but you instantly think of something specific, please 'put it on the desk'. Maybe I can compile something out of the input I get here (actually I am very frustrated as I did not find any such list via research by myself).

    Details on scoping: I found it very hard to formulate what I want in a way that excludes everything I do not need (which may be the reason why I did not find anything via Google). There is a database-centric definition of what I am looking for in the section 'Processes' of the second example link. That somehow fits, but the database focus sort of drifts away from the pattern thinking which I have in mind. So here are my own thoughts on what's in and what's out:

        - I am NOT looking for a foundational algorithm reference list of the kind implemented as the basis of any programming language. E.g. the PHP reference describes substr and strlen; that implements algorithms, but is not what I am looking for.
        - The problem the algorithm addresses would exist even if there were no computers (or other IT components).
        - The main focus of the algorithm is NOT to help other algorithms.
        - Chances are high that there are implementations of the solution, or some workaround without any IT support, out there in the world.
        - However, the algorithm could be beneficially implemented/fully supported by a software application. That means: the problem the algorithm addresses has to be handled anyway, but running an algorithm implementation with software automates the process (that is why I posted it on Stack Overflow and not somewhere else).
        - Typically such algorithm implementations have more than one input field value and more than one output field value, which implies they could not be implemented as a simple function (which is fixed to produce no more than one output value).
        - In a normalized data model, the outputs of such algorithm implementations often span multiple rows (sometimes multiple tables), whereby the number of output rows depends on the input parameters and the rows in the table(s) at start time. This implies that any algorithm implementation/procedure must interact with a database (read and/or write).

    I am mainly looking for patterns, not for specific implementations. Example: the Travelling Salesman assumes any coordinates; it does not say "you need a table targets with fields x and y". However, sometimes descriptions are focused very much on examples with specific implementations; no worries, as long as the pattern gets clear.

    Read the article

  • Remote iPhone / Xcode application development?

    - by ANE
    Four Java developers are new to iPod Touch/iPhone app development. They have an idea for an app. They have never used Xcode or Macs before. Instead of spending money on a new iMac or Mac Mini for each of them, my boss would like to sell them a $999 Apple server, hosted at a facility connected to a single T1 line, and have all 4 people work remotely in Xcode. Is this feasible? Is anyone doing anything like this? Specifically, is 1 T1 enough for realistic remote app development? Would they have to work in black & white via Logmein or Gotomeeting to get decent speed? Can four people work remotely together on an Xcode project at the same time? Do they absolutely need their own Macs to connect their iPod Touches or iPhones physically to, or can they connect to their existing PCs with iTunes and install their in-development apps that way?

    Read the article

  • Languages and development methodologies

    - by Carlos
    Having never worked with Ruby on Rails, I looked it up on Wikipedia. It says: "It is intended to be used with an Agile development methodology that is used by web developers for rapid development." This got me asking how a given language/framework can be more appropriate for given development methodologies. Are there certain languages that are more friendly for pair programming, for instance? Are there language features that make certain methodologies more appropriate? Are there features that make certain methodologies impossible? My initial reaction is to dismiss the connection (the design process is a business process, which is more dependent on business needs than language features). But I'm the only programmer within the firm, and I'm a partner, so I get to decide the business needs. What do you think? Also, if the SO community finds that certain languages point towards certain methodologies, what methodology is most common for C#, which is what I use most of the time?

    Read the article

  • Ideal dev/test/QA environment for development

    - by Nick
    I am working to rebuild my company's dev/test/QA environment. We have 10-15 programmers that are involved in a number of projects. They currently all develop locally on their PCs and use the dev environment for testing. We currently do not have a QA environment, so deployments are frequently a pain because bugs are usually found after something has gone live. Here's what I envision:

        - Doing away with everyone's local admin privileges and making everyone develop on a dev server
        - Create a QA environment that is identical to our production systems. This will allow them to test deployments.
        - Create a new test environment that is more locked down than the dev server so that proper testing can be done.

    What are your thoughts? What is the best way to set up an environment like this? We develop ASP .NET applications using MS Visual Studio 2008 (if that helps).

    Read the article

  • Minimum Hardware requirements for Android development

    - by vishwanath
    I need information about the minimum hardware requirements for a better experience developing Android applications. My current configuration is as follows: P4 3.0 GHz, 512 MB of RAM. I started with Hello Android development on my machine and the experience was sluggish; I was using Eclipse Helios for development. The emulator took a lot of time to start, and so did running the program. Do I need to upgrade my machine for development purposes, or is there something else I am missing on my machine (like heavy processing by some other application I might have installed)? And if I do need to upgrade, do I need to upgrade my processor too (which really amounts to a new machine, which I am not in favor of), or will upgrading the RAM suffice?

    Read the article

  • Managing Team Development on Shared Website

    - by stjowa
    I need to know the best way to manage team web development on a shared server (hostgator). I have done some individual web development on a shared server in the past, and I have always set up SVN over SSH to have a pretty nice development workflow (version control, quick commits, working through Eclipse/Subclipse, etc). However, I also know that with that setup I had to make some pretty sophisticated post-commit hooks to export the repository to /public_html and, therefore, make the repository code testable. This seems like a tedious and error-prone setup for an entire team. I would like to be able to:

        - Easily test the latest code in the repository.
        - Somewhat easily move the code in the repository to production.
        - Use an IDE like Eclipse/Subclipse to easily work with the repository.

    With this in mind, does anyone know of a good version-control/repository setup for developing a website with a team of about 4-5 people? Thanks a lot.
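
    For the "easily test the latest code" part, one common pattern is exactly the post-commit hook mentioned above: export the repository into a shared test docroot so every commit is immediately browsable, and keep production deployment as a separate deliberate step. A hedged sketch in Python follows; the docroot path is a placeholder, and the only assumption about SVN is the standard behaviour that the post-commit hook receives the repository path and revision number as its two arguments.

        #!/usr/bin/env python
        # Hedged sketch of an SVN post-commit hook (hooks/post-commit) that exports
        # the committed tree into a shared test docroot. Paths are placeholders.
        import subprocess
        import sys

        TEST_DOCROOT = "/home/shared/test_public_html"   # assumed test area, not production

        def main():
            repos, revision = sys.argv[1], sys.argv[2]    # supplied by svn on every commit
            # Export (not checkout) so no .svn metadata ends up under the web root
            subprocess.check_call([
                "svn", "export", "--force",
                "-r", revision,
                "file://" + repos,
                TEST_DOCROOT,
            ])

        if __name__ == "__main__":
            main()

    Moving code to production would then stay a separate step, for example exporting a tagged revision, which matches the "somewhat easily move to production" requirement.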

    Read the article

  • The Utilization of Software Engineering Development Principles

    - by Chance
    Being a CS student I've had to take a course in basic software engineering. I was a little curious to find such elaborate "software development processes", like the spiral model, the waterfall model, et cetera. Some of these methodologies seem a little antiquated to me and, after speaking with several employed developers, I can't seem to find anyone who actually adheres to these models. Does anyone here have experience working under the guidance of these models? Were they useful to you and your team during the development of your product? Or are these models just some way to communicate a sense of progression to interested parties outside of the development team?

    Read the article

  • Devising a test strategy

    - by Simon Callan
    As part of a new job, I have to devise and implement a complete test strategy for the company's new product. So far, all I really know about it is that it is written in C++, uses an SQL database and has a web API which is used by a browser client written using GWT. As far as I know, there isn't much of an existing strategy, except for using Python scripts to test the web API. I need to develop and implement a suitable strategy for unit, system, regression and release testing, preferably a fully automated one. I'm looking for good references for:

        - Devising the complete test strategy.
        - Testing the web API.
        - Testing the GWT-based application.
        - Unit testing C++ code.

    In addition, any suitable tools would be appreciated.
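
    Since the existing web API checks are already Python scripts, one low-friction starting point is to grow them into a proper automated suite that a CI job can run for regression and release testing. A hedged sketch with unittest and requests follows; the base URL, endpoint and response field are assumptions about the product, not its real API.

        # Hedged sketch: an automated web API check using unittest + requests.
        # The URL and response fields are placeholders for the real API.
        import unittest
        import requests

        BASE_URL = "http://localhost:8080/api"   # assumed test deployment of the product

        class StatusEndpointTests(unittest.TestCase):
            def test_status_returns_ok(self):
                resp = requests.get(BASE_URL + "/status", timeout=10)
                self.assertEqual(resp.status_code, 200)

            def test_status_payload_is_json(self):
                resp = requests.get(BASE_URL + "/status", timeout=10)
                payload = resp.json()
                self.assertIn("version", payload)   # assumed field in the response

        if __name__ == "__main__":
            unittest.main()

    Unit testing the C++ code itself would sit in a separate, compiler-level framework, and the GWT client has its own browser-based testing support; the API suite above is the piece that most naturally stays in Python.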

    Read the article

  • Most common software development mistakes

    - by hgulyan
    Inspired by Dealing with personal failure, I remembered my own failed software development experience. Finally I agreed to rewrite an existing application. It took me less than a week to rewrite the existing app, and up to 2 more months to write my own from zero. Those 2 months were really hard and interesting. It was my first big software development process. I researched almost everything related to my application. Read Code Complete. Even some articles on how to create a user interface. Some psychology stuff. Typography, colors. DAL, DB structure, SOA, patterns, UML, load testing, etc. I hope that after a month or 2 I will get the opportunity to continue working on my failed project, but before that, I would like to ask: What are common mistakes in software development? What shouldn't you do in any case?

    Read the article

  • Development environment for .NET

    - by user137348
    I'm about to create a development environment for a small development company (3-4 developers plus some testers). Our development platform is .NET and Oracle. My question is how to structure the whole environment. How many servers do I need? Should there be one server for developers and one for testers? I'd like to have one build server (TeamCity). Where should I put Subversion? Visual Studio would be on the developers' laptops. Do I need one database for development and an extra one for the build server? What else could be helpful?

    Read the article

  • Guide to reduce TFS database growth using the Test Attachment Cleaner

    - by terje
    Recently there have been several reports of TFS databases growing too fast and too big. Notably, this has been observed when one has started to use more features of the Testing system. Also, TFS 2010 handles test results differently from TFS 2008, and this leads to more data stored in the TFS databases. As a consequence of this, some tools have been released to remove unneeded data in the database, along with some fixes to correct bugs which have been found and corrected during this process. Further, some preventive practices and maintenance rules should be adopted. A lot of people have blogged about this, among them:

        - Anu's very important blog post here describes both the problem and solutions to handle it. She describes both the Test Attachment Cleaner tool, and also some QFE/CU releases to fix some underlying bugs which prevented the tool from being fully effective.
        - Brian Harry's blog post here describes the problem too.
        - This forum thread describes the problem with some solution hints.
        - Ravi Shanker's blog post here describes best practices on solving this (TBP).
        - Grant Holliday's blog post here describes strategies to use the Test Attachment Cleaner both to detect space problems and to rectify them.

    The problem can be divided into the following areas:

        - Publishing of test results from builds
        - Publishing of manual test results and their attachments in particular
        - Publishing of deployment binaries for use during a test run
        - Bugs in SQL Server preventing total cleanup of data

    (All the published data above is published into the TFS database as attachments.) The test results will include all data being collected during the run. Some of this data can grow rather large, like IntelliTrace logs and video recordings. Also the pushing of binaries, which happens for automated test runs, including tests run during a build using code coverage (which will include all the files in the deployment folder), contributes a lot to the size of the attached data. In order to handle this systematically, I have set up a 3-stage process:

        1. Find out if you have a database space issue.
        2. Set up your TFS server to minimize potential database issues.
        3. If you have the "problem", clean up the database and otherwise keep it clean.

    Analyze the data

    Are your database(s) growing? Are unused test results growing out of proportion? To find out, you need to query your TFS database for some of the information, and use the Test Attachment Cleaner (TAC) to obtain some more detailed information. If you don't have too many databases you can use the SQL Server reports from within the Management Studio to analyze the database and table sizes. Or, you can use a set of queries. I find queries often faster to use because I can tweak them the way I want them. But be aware that these queries are non-documented and non-supported and may change when the product team wants to change them. If you have multiple Project Collections, find out which might have problems. (Disclaimer: The queries below work on TFS 2010. They will not work on Dev-11, since the table structure has been changed. I will try to update them for Dev-11 when it is released.) Open a SQL Management Studio session onto the SQL Server where you have your TFS databases. Use the query below to find the Project Collection databases and their sizes, in descending size order.

        use master
        select DB_NAME(database_id) AS DBName, (size/128) SizeInMB
        FROM sys.master_files
        where type=0
          and substring(db_name(database_id),1,4)='Tfs_'
          and DB_NAME(database_id)<>'Tfs_Configuration'
        order by size desc

    Doing this on one of our SQL servers gives the following results: It is pretty easy to see on which collection to start the work.

    Find out which tables are possibly too large

    Keep a special watch out for the Tfs_Attachment table. Use the script at the bottom of Grant's blog to find the table sizes in descending size order. In our case we got this result: From Grant's blog we learnt that tbl_Content is in the Version Control category, so the only big issue we have here is the tbl_AttachmentContent.

    Find out which team projects have possibly too large attachments

    In order to use the TAC to find and eventually delete attachment data, we need to find out which team projects have these attachments. The team project is a required parameter to the TAC. Use the following query to find this; replace the collection database name with whatever applies in your case:

        use Tfs_DefaultCollection
        select p.projectname, sum(a.compressedlength)/1024/1024 as sizeInMB
        from dbo.tbl_Attachment as a
        inner join tbl_testrun as tr on a.testrunid=tr.testrunid
        inner join tbl_project as p on p.projectid=tr.projectid
        group by p.projectname
        order by sum(a.compressedlength) desc

    In our case we got this result (had to remove some names), out of more than 100 team projects accumulated over quite some years: As can be seen here, it is pretty obvious the "Byggtjeneste – Projects" are the main team project to take care of, with the ones on lines 2-4 as the next ones.

    Check which attachment types take up the most space

    It can be nice to know which attachment types take up the space, so run the following query:

        use Tfs_DefaultCollection
        select a.attachmenttype, sum(a.compressedlength)/1024/1024 as sizeInMB
        from dbo.tbl_Attachment as a
        inner join tbl_testrun as tr on a.testrunid=tr.testrunid
        inner join tbl_project as p on p.projectid=tr.projectid
        group by a.attachmenttype
        order by sum(a.compressedlength) desc

    We then got this result: From this it is pretty obvious that the problem here is the binary files, as also mentioned in Anu's blog.

    Check which file types, by their extension, take up the most space

    Run the following query:

        use Tfs_DefaultCollection
        select SUBSTRING(filename,len(filename)-CHARINDEX('.',REVERSE(filename))+2,999) as Extension, sum(compressedlength)/1024 as SizeInKB
        from tbl_Attachment
        group by SUBSTRING(filename,len(filename)-CHARINDEX('.',REVERSE(filename))+2,999)
        order by sum(compressedlength) desc

    This gives a result like this. Now you should have collected enough information to tell you what to do, if you have got to do something, and some of the information you need in order to set up your TAC settings file, both for a cleanup and for scheduled maintenance later.

    Get your TFS server and environment properly set up

    Even if you have got the problem, or have not yet got it, you should ensure the TFS server is set up so that the risk of getting into this problem is minimized. To ensure this you should install the following set of updates and components. The assumption is that your TFS Server is at SP1 level.

    Install the QFE for KB2608743, which also contains detailed instructions on its use; download from here. The QFE changes the default settings to not upload deployed binaries, which are used in automated test runs.
    Binaries will still be uploaded if:

        - Code coverage is enabled in the test settings.
        - You change UploadDeploymentItem to true in the testsettings file. Be aware that this might be reset back to false by another user who hasn't installed this QFE.

    The hotfix should be installed on:

        - The build servers (the build agents)
        - The machine hosting the Test Controller
        - Local development computers (Visual Studio)
        - Local test computers (MTM)

    It is not required to install it on the TFS Server, test agents or the build controller; it has no effect on these programs.

    If you use SQL Server 2008 R2 you should also install CU 10 (or later). This CU fixes a potential problem of hanging "ghost" files. This seems to happen only in certain trigger situations, but to ensure it doesn't bite you, it is better to make sure this CU is installed. There is no such CU for SQL Server 2008 pre-R2. Workaround: if you suspect hanging ghost files, they can be (with some mental effort) deduced from the ghost counters using the following SQL query:

        use master
        SELECT DB_NAME(database_id) as 'database', OBJECT_NAME(object_id) as 'objectname',
               index_type_desc, ghost_record_count, version_ghost_record_count,
               record_count, avg_record_size_in_bytes
        FROM sys.dm_db_index_physical_stats (DB_ID(N'<DatabaseName>'), OBJECT_ID(N'<TableName>'), NULL, NULL, 'DETAILED')

    The problem is a stalled ghost cleanup process. The workaround is to stop all components that depend on the SQL server, like the TFS Server and SPS services (that is, all applications that connect to the SQL server), then restart the SQL server, and finally start up all dependent processes again. (I would guess a complete server reboot would do the trick too.) After this the ghost cleanup process will run properly again. The fix will come in the next CU cycle for SQL Server R2 SP1. The R2 pre-SP1 and R2 SP1 have separate maintenance cycles and are maintained individually; each has its own set of CUs. When it comes I will add the link here to that CU. The "hanging ghost file" issue came up after one had run the TAC and deleted enormous amounts of data. The SQL Server can get into this hanging state (without the QFE) in certain cases due to this.

    And of course, install and set up the Test Attachment Cleaner command line power tool. This should be done following some guidelines from Ravi Shanker: "When you run TAC, ensure that you are deleting small chunks of data at regular intervals (say run TAC every night at 3AM to delete data that is between age 730 to 731 days) – this will ensure that small amounts of data are being deleted and SQL ghosted record cleanup can catch up with the number of deletes performed." This rule minimizes the risk of the ghost hang problem occurring, and further makes it easier for the SQL server ghosting process to work smoothly. "Run DBCC SHRINKDB post the ghosted records are cleaned up to physically reclaim the space on the file system." This is the last step in a 3-step process of removing SQL server data: first the records are logically deleted, then they are cleaned out by the ghosting process, and finally the space is reclaimed using the shrinkdb command.

    Cleaning out the attachments

    The TAC is run from the command line using a set of parameters and controlled by a settings file. The parameters point out a server URI including the team project collection and also point at a specific team project. So in order to run this for multiple team projects regularly, one has to set up a script to run the TAC multiple times, once for each team project.
    When you install the TAC there is a very useful readme file in the same directory. When the deployment binaries are published to the TFS server, ALL items are published up from the deployment folder. That often means many more files than you would assume are necessary. This is a brute force technique. It works, but you need to take care when cleaning up.

    Grant has shown how their settings file looks in his blog post, removing all attachments older than 180 days, as long as there are no active work items connected to them. This setting can be useful to clean out all items, both in a one-time clean-up operation and in a general scheduled cleanup. There are two scenarios we need to consider:

        - Cleaning up an existing overgrown database
        - Maintaining a server to avoid an overgrown database, using a scheduled TAC

    1. Cleaning up a database which has grown too big due to these attachments

    This job is a "once" job. We do this once and then move on to make sure it won't happen again, by taking the actions in 2) below. In this scenario you should only consider the large files. Your goal should be to simply reduce the size, and not bother about the smaller stuff. That can be left to a scheduled TAC cleanup (2 below). Here you can use a very general settings file, and just remove the large attachments, or you can choose to remove any old items. Grant's settings file is an example of the latter. A settings file to remove only large attachments could look like this:

        <!-- Scenario : Remove large files -->
        <DeletionCriteria>
          <TestRun />
          <Attachment>
            <SizeInMB GreaterThan="10" />
          </Attachment>
        </DeletionCriteria>

    Or like this: if you want to remove only dll's and pdb's above that size, add an Extensions section. Without that section, all extensions will be deleted.

        <!-- Scenario : Remove large files of type dll's and pdb's -->
        <DeletionCriteria>
          <TestRun />
          <Attachment>
            <SizeInMB GreaterThan="10" />
            <Extensions>
              <Include value="dll" />
              <Include value="pdb" />
            </Extensions>
          </Attachment>
        </DeletionCriteria>

    Before you start up your scheduled maintenance, you should clear out all older items.

    2. Scheduled maintenance using the TAC

    Run a schedule every night that removes old items, and remove them in small batches. It is important to run this often, like every night, in order to keep the number of deleted items low. That way the SQL ghost process works better. One approach could be to delete all items older than some number of days, let's say 180 days. This could be combined with restricting it to keep attachments with active or resolved bugs. Doing this every night ensures that only small amounts of data are deleted.

        <!-- Scenario : Remove old items except if they have active or resolved bugs -->
        <DeletionCriteria>
          <TestRun>
            <AgeInDays OlderThan="180" />
          </TestRun>
          <Attachment />
          <LinkedBugs>
            <Exclude state="Active" />
            <Exclude state="Resolved"/>
          </LinkedBugs>
        </DeletionCriteria>

    In my experience there are projects which are left with active or resolved work items although no further work is done. It can be wise to have a cleanup process with no restrictions on linked bugs at all. Note that you then have to remove the whole LinkedBugs section. An approach which could work better here is a two-step approach: use the schedule above, with no LinkedBugs section, as a sweeper task taking away all data older than you could care about. Then have another scheduled TAC task to take out more specifically attachments that you are not likely to use.
    This task could be much more specific, and based on your analysis clean out what you know is troublesome data.

        <!-- Scenario : Remove specific files early -->
        <DeletionCriteria>
          <TestRun>
            <AgeInDays OlderThan="30" />
          </TestRun>
          <Attachment>
            <SizeInMB GreaterThan="10" />
            <Extensions>
              <Include value="iTrace"/>
              <Include value="dll"/>
              <Include value="pdb"/>
              <Include value="wmv"/>
            </Extensions>
          </Attachment>
          <LinkedBugs>
            <Exclude state="Active" />
            <Exclude state="Resolved" />
          </LinkedBugs>
        </DeletionCriteria>

    The readme document for the TAC says that it recognizes "internal" extensions, but it does recognize any extension. To run the tool, use the following command:

        tcmpt attachmentcleanup /collection:your_tfs_collection_url /teamproject:your_team_project /settingsfile:path_to_settingsfile /outputfile:%temp%/teamproject.tcmpt.log /mode:delete

    Shrinking the database

    You could run a shrink database command after the TAC has run in cases where a lot of data has been deleted. In this case you SHOULD do it, to free up all that space. But after the shrink operation you should rebuild the indexes, since the shrink operation will leave the database in a very fragmented state, which will reduce performance. Note that you need to rebuild the indexes; reorganizing is not enough. For smaller amounts of data you should NOT shrink the database, since the data will be reused by the SQL server when it needs to add more records. In fact, it is regarded as a bad practice to shrink the database regularly, so on a daily maintenance schedule you should NOT shrink the database. To shrink the database you run a DBCC SHRINKDATABASE command, and then follow up with a DBCC INDEXDEFRAG afterwards. I find the easiest way to do this is to create a SQL Maintenance plan including the Shrink Database Task and the Rebuild Index Task and just execute it when you need to do this.
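
    As the "Cleaning out the attachments" section above notes, the TAC takes a single team project per run, so in practice a small wrapper script is used to sweep several projects on a schedule. A hedged sketch in Python follows; the collection URL, project names and settings-file path are placeholders, and the tcmpt arguments simply mirror the command line shown above.

        # Hedged sketch: run the Test Attachment Cleaner once per team project,
        # as suggested above. Collection URL, project list and paths are placeholders.
        import os
        import subprocess
        import tempfile

        COLLECTION = "http://tfsserver:8080/tfs/DefaultCollection"   # assumed collection URL
        TEAM_PROJECTS = ["ProjectA", "ProjectB"]                     # assumed project names
        SETTINGS_FILE = r"C:\tac\nightly-settings.xml"               # assumed settings file

        def clean_project(project):
            log_file = os.path.join(tempfile.gettempdir(), project + ".tcmpt.log")
            subprocess.check_call([
                "tcmpt", "attachmentcleanup",
                "/collection:" + COLLECTION,
                "/teamproject:" + project,
                "/settingsfile:" + SETTINGS_FILE,
                "/outputfile:" + log_file,
                "/mode:delete",
            ])

        if __name__ == "__main__":
            for project in TEAM_PROJECTS:
                clean_project(project)

    Scheduling this nightly keeps each deletion batch small, which is exactly what the guideline about the SQL ghosted record cleanup asks for.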

    Read the article

  • How to test a 3D rendering engine?

    - by YoYo
    Some friends and I are developing a simple 3D rendering engine as practice for university. We used Ogre 3D as a prototype and now we are developing it from the base. The engine is wrapped up in a simple game that asks the user to select a shape (circle, triangle, square...), color and dimensions and renders the image to the screen. It also lets the user move and rotate the shape on screen using the mouse. We would like to automate testing of the view rendering. I could not find any test framework for this issue, and I would like to know how 3D testing is done in a non-manual manner.
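
    One widespread non-manual approach is golden-image testing: render each shape setup to an off-screen buffer, save the frame to a file, and compare it against a stored reference image within a small tolerance. A hedged sketch of the comparison step in Python using Pillow follows; the file names and threshold are assumptions, and the actual rendering of the frame would be driven by the engine beforehand.

        # Hedged sketch: compare a freshly rendered frame against a stored reference
        # image, allowing a small tolerance for driver/anti-aliasing differences.
        # File names and the threshold are placeholders.
        from PIL import Image, ImageChops

        TOLERANCE = 3.0   # assumed mean per-channel difference allowed

        def mean_difference(rendered_path, reference_path):
            rendered = Image.open(rendered_path).convert("RGB")
            reference = Image.open(reference_path).convert("RGB")
            if rendered.size != reference.size:
                raise AssertionError("Rendered frame has unexpected size %s" % (rendered.size,))
            diff = ImageChops.difference(rendered, reference)
            pixels = list(diff.getdata())
            total = sum(sum(px) for px in pixels)
            return total / (len(pixels) * 3.0)

        def test_red_triangle_frame():
            # The engine is assumed to have rendered 'red_triangle.png' off-screen first
            assert mean_difference("red_triangle.png", "golden/red_triangle.png") <= TOLERANCE

    The mouse-driven move/rotate behaviour can be tested separately without pixels at all, by scripting the transformations and asserting on the resulting matrices.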

    Read the article

  • vim does not preserve symlink over sshfs

    - by HighCommander4
    I'm having some trouble with symlinks and sshfs. I use the '-o follow_symlinks' option to follow symlinks on the server side, but whenever I edit a symlinked file on the client side with vim, a copy of it is made on the server side, i.e. it's no longer a symlink. Set up a symlink on the server side:

        me@machine1:~$ echo foo > test.txt
        me@machine1:~$ mkdir test
        me@machine1:~$ cd test
        me@machine1:~/test$ ln -s ../test.txt test.txt
        me@machine1:~/test$ ls -al test.txt
        lrwxrwxrwx 1 me me 11 Jan 5 21:13 test.txt -> ../test.txt
        me@machine1:~/test$ cat test.txt
        foo
        me@machine1:~/test$ cat ../test.txt
        foo

    So far so good. Now:

        me@machine2:~$ mkdir test
        me@machine2:~$ sshfs me@machine1:test test -o follow_symlinks
        me@machine2:~$ cd test
        me@machine2:~/test$ vim test.txt
        [in vim, add a new line "bar" to the file]
        me@machine2:~/test$ cat test.txt
        foo
        bar

    Now observe what this does to the file on the server side:

        me@machine1:~/test$ ls -al test.txt
        -rw-r--r-- 1 me me 19 Jan 5 21:24 test.txt
        me@machine1:~/test$ cat test.txt
        foo
        bar
        me@machine1:~/test$ cat ../test.txt
        foo

    As you can see, it made a copy and only edited the copy. How can I get it to work so it actually follows the symlink when editing the file?

    Read the article

  • What management/development practices do you change when a team of 1-3 developers grows to 10+?

    - by Mag20
    My team built a website for a client several years ago. The site traffic has been growing very quickly and our client has been asking us to grow our team to fill their maintenance and feature request needs. We started with a small number of developers, and our team has grown - now we're in the double digits. What management/development changes are the most beneficial when a team grows from a small "garage-size" team to 10+ developers?

    Read the article

  • Teaching "web design/development" to high-school home-school group. Good sources?

    - by anonymous coward
    I may soon begin teaching a "web design and development" class for a home-school co-op group. Any suggestions for "course" material? My first thought was to work through the Opera Web Standards Curriculum, but am interested in hearing about possible alternatives or suggestions that better cover the "very basics" of getting started with designing and developing web pages. Not necessarily looking for topics, so much as existing resources. Thanks so much for your input!

    Read the article

  • Are Java servers really more preferable for web-development? [closed]

    - by Gerald Goward
    Many experienced people I know tell me that many web projects, including enterprise ones, are better developed with Java as the back-end. Reasons: Ubuntu servers being cheaper and more reliable; MySQL being much more lightweight than the "heavyweight" MS SQL. I heard some more, but I really can't remember all of them. My question: I believe ASP.NET and Java are BOTH good for web development and it's just a holy-war subject. Am I right or not?

    Read the article

  • Career Development: What should I learn next after Python? and Why? [closed]

    - by Josh
    Hi all, I'm currently learning Python. I want to know what I should learn next out of these programming languages:

        - PHP
        - ActionScript 3
        - Objective-C (iPhone applications)

    I work in the multimedia industry and have decided to learn Python seriously as a first programming language, because I would like to learn the basics of programming and mainly write scripts at work that automate tasks (e.g. edit multiple XML files quickly). At work we have a senior developer who knows ActionScript and PHP very well (although he knows PHP better). We have also been developing iPhone applications for 2 weeks; our senior developer could learn it, although we have lots of work currently of the PHP and ActionScript 3 type and haven't had time or reason to pick up iOS development. Here are the reasons I want to learn each language, but I cannot decide what I'll learn next:

        - PHP: I want to learn PHP because it will help with web development. PHP is very wanted by employers. The senior developer at work writes everything in it: web sites, CMS, etc. (including XML checks and scripts). I will learn a lot from him (once I learn the basics). However, I don't want to learn web because you have to deal with lots of cross-browser problems.
        - ActionScript 3: At work we are looking to put on another developer to help with online activities and very small games (using ActionScript 3.0 and Flash CS5), for example first aid activities etc. I would like to do things that have an element of design, as I'm better at Photoshop than at developing. I want to be creative; I like to interact with users in a fun way.
        - Objective-C (iPhone applications): We are an all-Mac office; we may get more iPhone/iPad application work (jobs) that needs to be done. Work has found it nearly impossible to find good iPhone developers. I like Apple products (Macs and iPhones), and I would like to make my own games and applications in my spare time (if I knew how).

    Should I learn ActionScript first because it would be easier to learn than Objective-C? Should I learn PHP because it is very widely used? Should I learn Objective-C because it is really wanted by employers now?

    Read the article

  • When and why are certain data structures used in the context of web development?

    - by Ein Doofus
    While browsing around the MSDN I came across http://msdn.microsoft.com/en-us/library/aa287104%28v=vs.71%29 which lists various data structures such as:

        - Queues
        - Stacks
        - Hashtables
        - Binary Trees
        - Binary Search Trees
        - Graphs

    (I believe there are also Lists.) I was hoping to get a high-level overview of when these various data structures can be used in the broad context of web development, and when used, why one data structure is generally used instead of any other one.
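
    As a quick illustration of where a few of these commonly show up server-side, here is a hedged sketch; the scenarios (session cache, email queue, navigation history) are invented examples, not anything prescribed by the MSDN page.

        # Hedged illustration of typical server-side uses; the scenarios are invented.
        from collections import deque

        # Hashtable: O(1) lookup of session data or cached page fragments by key
        session_cache = {}
        session_cache["user:42"] = {"name": "Ada", "cart_items": 3}

        # Queue: work deferred out of the request cycle (e.g. emails), processed FIFO
        email_queue = deque()
        email_queue.append({"to": "ada@example.com", "template": "welcome"})
        next_job = email_queue.popleft()

        # Stack: undo or navigation history, processed LIFO
        history = []
        history.append("/products")
        history.append("/products/42")
        previous_page = history.pop()   # -> "/products/42"

        print(next_job, previous_page)

    Trees and graphs tend to appear less directly in application code and more inside the infrastructure, for example database indexes (search trees) or link structures between pages (graphs).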

    Read the article
