Search Results

Search found 60836 results on 2434 pages for 'system io directory'.


  • What is the correct way to backup ZODB blobs?

    - by joeforker
    I am using plone.app.blob to store large ZODB objects in a blobstorage directory. This reduces size pressure on Data.fs but I have not been able to find any advice on backing up this data. I am already backing up Data.fs by pointing a network backup tool at a directory of repozo backups. Should I simply point that tool at the blobstorage directory to backup my blobs? What if the database is being repacked or blobs are being added and deleted while the copy is taking place? Are there files in the blobstorage directory that must be copied over in a certain order?
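
    A minimal sketch of one common approach, assuming a buildout-style layout (all paths below are placeholders): repozo for Data.fs plus rsync for the blobstorage directory. It does not by itself answer the repack/concurrency concern above, so avoiding packing while the copy runs is still advisable.

        #!/bin/sh
        # Hypothetical instance paths; adjust to your layout.
        BIN=/opt/plone/zinstance/bin
        VAR=/opt/plone/zinstance/var

        # Incremental repozo backup of the filestorage.
        $BIN/repozo -B -v -r /backups/filestorage -f $VAR/filestorage/Data.fs

        # Mirror the blob directory; --delete keeps removals in sync.
        rsync -a --delete $VAR/blobstorage/ /backups/blobstorage/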


  • Issue converting Sitecore Item[] using ToList<T>

    - by philba888
    Working with Sitecore and Linq extensions. I am trying to convert an item array to a list using the following piece of code: Item variationsFolder = masterDB.SelectSingleItem(VariationsFolderID.ToString()); List<Item> variationList = variationsFolder.GetChildren().ToList<Item>(); However, I keep getting this error whenever I try to build: 'Sitecore.Collections.ChildList' does not contain a definition for 'ToList' and the best extension method overload 'System.Linq.Enumerable.ToList<TSource>(System.Collections.Generic.IEnumerable<TSource>)' has some invalid arguments. I have the following usings: using System.Linq; using System.Xml.Linq; I am referencing System.Core. I've just copied this code from another location, so it should work fine; I can only think that there is something simple (like a reference) that I am missing.
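
    A guess at the cause: in this Sitecore version ChildList appears to expose only the non-generic IEnumerable, so the generic ToList<TSource> extension cannot bind to it. If that is the case, Cast<Item>() (or OfType<Item>()) bridges the gap, roughly:

        // Hedged sketch: Cast<Item>() turns the non-generic sequence into
        // IEnumerable<Item>, which ToList() then accepts.
        Item variationsFolder = masterDB.SelectSingleItem(VariationsFolderID.ToString());
        List<Item> variationList = variationsFolder.GetChildren().Cast<Item>().ToList();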


  • gwt maven war plugin configuration problem

    - by Din
    I am developing a gwt application in maven. In this I am using maven war plugin. Everything works fine. When I give mvn install command it builds abc.war file in target folder. But it is not copying compiled javascript files ("module1" and "module2" directories present in target) to war directory. I want to get newly compiled javascript files in war directory. How to achieve this? pom.xml file <?xml version="1.0" encoding="UTF-8"?> <project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd"> <modelVersion>4.0.0</modelVersion> <groupId>example</groupId> <artifactId>example</artifactId> <packaging>war</packaging> <version>12</version> <name>gwt-maven-archetype-project</name> <properties> <!-- convenience to define GWT version in one place --> <gwt.version>2.1.0</gwt.version> <noServer>false</noServer> <skipTest>true</skipTest> <gwt.localWorkers>1</gwt.localWorkers> <JAVA_HOME>C:\Program Files\Java\jdk1.6.0_22</JAVA_HOME> <!-- convenience to define Spring version in one place --> </properties> <dependencies> <!-- Required dependencies--> </dependencies> <build> <finalName>abc</finalName> <outputDirectory>war/WEB-INF/classes</outputDirectory> <plugins> <plugin> <groupId>org.apache.maven.plugins</groupId> <artifactId>maven-compiler-plugin</artifactId> <configuration> <verbose>true</verbose> <executable>${JAVA_HOME}\bin\java.exe</executable> <compilerVersion>1.6</compilerVersion> <source>1.6</source> <target>1.6</target> </configuration> </plugin> <plugin> <groupId>org.codehaus.mojo</groupId> <artifactId>gwt-maven-plugin</artifactId> <version>2.1.0</version> <executions> <execution> <goals> <goal>compile</goal> <goal>generateAsync</goal> <goal>mergewebxml</goal> <goal>test</goal> </goals> </execution> </executions> <configuration> <servicePattern>**/client/**/*Service.java</servicePattern> <noServer>${noServer}</noServer> <noserver>${noServer}</noserver> <modules> <module>com.abc.example.Module1</module> <module>com.abc.example.Module2</module> </modules> <runTarget>com.abc.example.Module1/module1.jsp</runTarget> <port>8080</port> <extraJvmArgs>-Xmx1024m -Xms1024m -Xss1024k -Dgwt.jjs.permutationWorkerFactory=com.google.gwt.dev.ThreadedPermutationWorkerFactory</extraJvmArgs> <hostedWebapp>war</hostedWebapp> <warSourceDirectory>${basedir}/war</warSourceDirectory> <webXml>${basedir}/war/WEB-INF/web.xml</webXml> </configuration> </plugin> <plugin> <artifactId>maven-antrun-plugin</artifactId> <executions> <execution> <phase>process-classes</phase> <configuration> </configuration> <goals> <goal>run</goal> </goals> </execution> </executions> </plugin> <plugin> <groupId>org.apache.maven.plugins</groupId> <artifactId>maven-war-plugin</artifactId> <version>2.1-beta-1</version> <configuration> <warSourceDirectory>${basedir}/war</warSourceDirectory> <webXml>${basedir}/war/WEB-INF/web.xml</webXml> <!--<webXml>src/main/webapp/WEB-INF/web.xml</webXml>--> <containerConfigXML>war/WEB-INF/classes/context/context.xml</containerConfigXML> <warSourceExcludes>.gwt-tmp/**</warSourceExcludes> </configuration> </plugin> <plugin> <groupId>org.codehaus.mojo</groupId> <artifactId>cobertura-maven-plugin</artifactId> <executions> <execution> <goals> <goal>clean</goal> </goals> </execution> </executions> </plugin> <plugin> <groupId>org.apache.maven.plugins</groupId> <artifactId>maven-surefire-plugin</artifactId> <version>2.4.2</version> <configuration> <argLine>-Xmx1024m</argLine> 
<skipTests>${skipTest}</skipTests> </configuration> </plugin> <plugin> <artifactId>maven-clean-plugin</artifactId> <version>2.2</version> <configuration> <filesets> <fileset> <directory>war/module1</directory> </fileset> <fileset> <directory>war/module2</directory> </fileset> <fileset> <directory>war/WEB-INF/lib</directory> </fileset> </filesets> </configuration> </plugin> </plugins> <resources> <resource> <directory>src/main/resources</directory> <excludes> <exclude>**/public/resources/**</exclude> <exclude>**/public/images/**</exclude> </excludes> <filtering>true</filtering> </resource> </resources> <filters> <filter>src/main/resources/build/build-${env}.properties</filter> </filters> </build> <profiles> <profile> <activation> <activeByDefault>true</activeByDefault> </activation> <id>dev</id> <properties> <env>dev</env> </properties> </profile> </profiles> <reporting> <plugins> <plugin> <groupId>org.codehaus.mojo</groupId> <artifactId>cobertura-maven-plugin</artifactId> </plugin> </plugins> </reporting>
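
    One hedged possibility: the gwt-maven-plugin compile goal writes the generated module directories to its webappDirectory parameter (by default under target), so pointing that parameter at the war source directory may be all that is needed to get module1 and module2 copied into war. A sketch of the extra configuration element:

        <plugin>
          <groupId>org.codehaus.mojo</groupId>
          <artifactId>gwt-maven-plugin</artifactId>
          <configuration>
            <!-- Assumption: write the compiled module1/module2 output straight into war/ -->
            <webappDirectory>${basedir}/war</webappDirectory>
          </configuration>
        </plugin>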


  • CruiseControl Console error message when parsing XML

    - by jarad
    I have CruiseControl(1.5) running on Win2k8R2 and svn(1.6.9). The error happens on a successful build after a nant(0.86) Timeout(600 seconds). When I check the build dir, everything is built correctly, but the CruiseControl Dashboard reports an exception. Here is the error shown in the console: [:DEBUG] Exception: System.Xml.XmlException: The ',' character, hexadecimal value 0x2C, cannot be included in a name. Line 5544, position 274. at System.Xml.XmlTextReaderImpl.Throw(Exception e) at System.Xml.XmlTextReaderImpl.ParseElement() at System.Xml.XmlTextReaderImpl.ParseDocumentContent() at System.Xml.XmlWriter.WriteNode(XmlReader reader, Boolean defattr) at ThoughtWorks.CruiseControl.Core.Util.XmlFragmentWriter.WriteNode(XmlReader reader, Boolean defattr) at ThoughtWorks.CruiseControl.Core.Util.XmlFragmentWriter.WriteNode(String xml)


  • Simple, non-blocking way to sleep?

    - by OverTheRainbow
    Hello, I googled for this and read some threads here, but I haven't found a simple way to have a VB.Net application sleep for a little while and still keep the application responsive: Imports System.Net Imports System.IO Imports System.Text Imports System.Text.RegularExpressions Imports System.Threading.Thread [...] 'How to keep the screen from freezing? While True ListBox1.Items.Clear() ListBox1.Items.Add("blah") 'Not much difference ListBox1.Refresh() 'Wait 1mn Sleep(60000) End While Is there really no simple, non-blocking solution to have a VB.Net application wait for a few seconds? Thank you.
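
    One non-blocking sketch, assuming a WinForms form (the timer and handler names are invented and the code lives in the form's code-behind): replace the While/Sleep loop with a Timer so the UI thread is never blocked and the form stays responsive.

        ' Hedged sketch: the Tick handler does the work once a minute; no Sleep needed.
        Private WithEvents refreshTimer As System.Windows.Forms.Timer

        Private Sub Form1_Load(ByVal sender As Object, ByVal e As EventArgs) Handles MyBase.Load
            refreshTimer = New System.Windows.Forms.Timer()
            refreshTimer.Interval = 60000 ' one minute
            refreshTimer.Start()
        End Sub

        Private Sub refreshTimer_Tick(ByVal sender As Object, ByVal e As EventArgs) Handles refreshTimer.Tick
            ListBox1.Items.Clear()
            ListBox1.Items.Add("blah")
        End Sub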


  • How many repositories should I use to maintain my scripts under version control?

    - by romandas
    I mainly code small programs for myself, but recently, I've been starting to code for my peers on my team. To that end, I've started using a Mercurial repository to maintain my code in some form of version control (specifically, Tortoise-Hg on Windows). I have many small scripts, each in their own directory, all under one repository. However, while reading Joel's Hg Tutorial, I tried cloning a directory for one of my bigger scripts to create a "stable" version and found I couldn't do it because the directory wasn't itself a repository. So, I assume (and please correct me if I'm mistaken) that in order to use cloning properly, I'd have to create a repository for each script/directory. But.. would that be a "good idea" or a future maintenance nightmare waiting to happen? Succinctly, do I keep all my (unrelated) scripts in one repository, or should I create a repository for each? Or some unknown third option?


  • Describe the Damas-Milner type inference in a way that a CS101 student can understand

    - by user128807
    Hindley-Milner is a type system that is the basis of the type systems of many well-known functional programming languages. Damas-Milner is an algorithm that infers (deduces?) types in a Hindley-Milner type system. Wikipedia gives a description of the algorithm which, as far as I can tell, amounts to a single word: "unification." Is that all there is to it? If so, that means that the interesting part is the type system itself, not the type inference system. If Damas-Milner is more than unification, I would like a description of Damas-Milner that includes a simple example and, ideally, some code. Also, this algorithm is often said to do type inference. Is it really an inference system? I thought it was only deducing the types. Related questions: What is Hindley-Milner? Type inference to unification problem
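
    A tiny worked example of the unification step (my own notation), inferring the type of \f -> f 3:

        f          : a                        -- fresh type variable for the lambda binder
        3          : Int
        f 3        forces  a ~ Int -> b       -- unification, b fresh; so f 3 : b
        \f -> f 3  : (Int -> b) -> b
        generalize : forall b. (Int -> b) -> b

    Unification does the heavy lifting, but Algorithm W also prescribes where fresh variables are introduced, how the resulting substitutions are composed and threaded through sub-expressions, and when a type gets generalized with a forall (at let-bindings), which is roughly why it amounts to more than the single word.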


  • Symlinking (ln) faster than moving (mv)?

    - by Chad Johnson
    When we build web software releases, we prepare the release in a temporary directory and then replace the release directory with the temporary one just prepared: # Move and replace existing release directory. mv /path/to/httpdocs /path/to/httpdocs.before mv /path/to/$newReleaseName /path/to/httpdocs Under this scheme, in about 1 of every 15 releases a user happens to be using a file in the original release directory at the exact moment the commands above run, and a fatal error occurs for that user. I am wondering if symlinking as follows would be significantly faster, in terms of processing time, thereby helping to lessen the likelihood of this problem: # Remove and replace existing release symlink. ln -sf /path/to/$newReleaseName path/to/httpdocs
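
    A sketch of the usual way to make the cutover atomic rather than merely faster: build the new symlink under a temporary name and rename it over the live one, since rename(2) is atomic, so a request sees either the old or the new target but never a missing path. Note the very first cutover still needs the real httpdocs directory moved aside once, because mv -T will not replace a non-empty directory.

        # Hedged sketch; paths as in the question, mv -T assumes GNU coreutils.
        ln -sfn /path/to/$newReleaseName /path/to/httpdocs.tmp
        mv -T /path/to/httpdocs.tmp /path/to/httpdocs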


  • What does the "ApplicationDirectory" Membership condition mean in .NET Code Access Security?

    - by smwikipedia
    I am not sure about the semantics of the "ApplicationDirectory" membership condition. I am trying to use it in the .NET Framework 2.0 configuration tool. The tool's explanation of it is as follows: The Application Directory membership condition is true for all assemblies in the same directory or in a child directory of the running application. Assemblies that meet this membership condition will be granted the permissions associated with this code group. All the other membership conditions, such as strong name and hash, require me to input some criteria; only the Application Directory one does not. How do I use it? Could someone give an explanation by example? Many thanks.


  • Create Folders from text file and place dummy file in them using a CustomAction

    - by Birkoff
    I want my msi installer to generate a set of folders in a particular location and put a dummy file in each directory. Currently I have the following CustomActions: <CustomAction Id="SMC_SetPathToCmd" Property="Cmd" Value="[SystemFolder]cmd.exe"/> <CustomAction Id="SMC_GenerateMovieFolders" Property="Cmd" ExeCommand="for /f &quot;tokens=* delims= &quot; %a in ([MBSAMPLECOLLECTIONS]movies.txt) do (echo %a)" /> <CustomAction Id="SMC_CopyDummyMedia" Property="Cmd" ExeCommand="for /f &quot;tokens=* delims= &quot; %a in ([MBSAMPLECOLLECTIONS]movies.txt) do (copy [MBSAMPLECOLLECTIONS]dummy.avi &quot;%a&quot;\&quot;%a&quot;.avi)" /> These are called in the InstallExecuteSequence: <Custom Action="SMC_SetPathToCmd" After="InstallFinalize"/> <Custom Action="SMC_GenerateMovieFolders" After="SMC_SetPathToCmd"/> <Custom Action="SMC_CopyDummyMedia" After="SMC_GenerateMovieFolders"/> The custom actions seem to start, but only a blank command prompt window is shown and the directories are not generated. The files needed for the customaction are copied to the correct directory: <Directory Id="WIX_DIR_COMMON_VIDEO"> <Directory Id="MBSAMPLECOLLECTIONS" Name="MB Sample Collections" /> </Directory> <DirectoryRef Id="MBSAMPLECOLLECTIONS"> <Component Id="SampleCollections" Guid="C481566D-4CA8-4b10-B08D-EF29ACDC10B5" DiskId="1"> <File Id="movies.txt" Name="movies.txt" Source="SampleCollections\movies.txt" Checksum="no" /> <File Id="series.txt" Name="series.txt" Source="SampleCollections\series.txt" Checksum="no" /> <File Id="dummy.avi" Name="dummy.avi" Source="SampleCollections\dummy.avi" Checksum="no" /> </Component> </DirectoryRef> What's wrong with these Custom Actions or is there a simpler way to do this?
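
    One hedged guess about the blank prompt: cmd.exe ignores its arguments unless they begin with /c (run and exit) or /k (run and stay), so the for loop is never executed. Adding the switch to the existing ExeCommand values would look roughly like the sketch below (same echo test as above, just prefixed with /c; swapping echo for md would then actually create the directories).

        <!-- Sketch only: same command as in the question, with /c so cmd runs it and exits. -->
        <CustomAction Id="SMC_GenerateMovieFolders" Property="Cmd"
            ExeCommand="/c for /f &quot;tokens=* delims= &quot; %a in ([MBSAMPLECOLLECTIONS]movies.txt) do (echo %a)" />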


  • Ubuntu btrfs: how to remove rootflags=subvol=@ from grub.cfg

    - by mnpria
    When i mount "btrfs" as a root filesytem, the mount info is as below: root@ubuntu1304Btrfs:~# mount /dev/mapper/ubuntu1304Btrfs--vg-root on / type btrfs (rw,subvol=@) Is there a way to have a mount info without the "subvol" information ? I have tried executing what was mentioned here. I also updated the grub.cfg. Still rootflags=subvol=@ is not removed. Is there a way to remove this subvol information ? root@ubuntu1304Btrfs:/home# mount /dev/mapper/ubuntu1304Btrfs--vg-root on / type btrfs (rw,subvol=@) /dev/mapper/ubuntu1304Btrfs--vg-root on /home type btrfs (rw,subvol=@home) root@ubuntu1304Btrfs:/# stat / File: ‘/’ Size: 262 Blocks: 0 IO Block: 4096 directory Device: 12h/18d Inode: 256 Links: 1 Access: (0755/drwxr-xr-x) Uid: ( 0/ root) Gid: ( 0/ root) Access: 2013-11-11 19:56:04.548121873 +0530 Modify: 2013-11-11 19:55:18.008120103 +0530 Change: 2013-11-11 19:55:18.008120103 +0530 Birth: - root@ubuntu1304Btrfs:/# stat /home/ File: ‘/home/’ Size: 230 Blocks: 0 IO Block: 4096 directory Device: 19h/25d Inode: 256 Links: 1 Access: (0755/drwxr-xr-x) Uid: ( 0/ root) Gid: ( 0/ root) Access: 2013-11-12 12:24:52.346377976 +0530 Modify: 2013-11-12 12:24:50.338377900 +0530 Change: 2013-11-12 12:24:50.338377900 +0530 Birth: -


  • bjam with Visual Studio 2010

    - by ra170
    ok, so I ran into problems with Boost under Visual Studio 2010, so I decided to rebuild it with bjam, like so: bjam --toolset=msvc-10.0 --build-type=complete After running bjam (successfully?) it created a new directory under boost_1_42_0 called bin.v2. Inside bin.v2 is a directory called libs. Two issues: 1. There are a lot fewer libs under that new directory (about 13); the old libs directory has 88. Is it supposed to be like that, or did something fail? 2. The structure is somewhat different too. What do I do with this exactly? Meaning, do I copy it over to the original libs, delete the old libs, or try rebuilding it with different flags?


  • Java compiler error: "cannot find symbol" when trying to access local variable

    - by HH
    $ javac GetAllDirs.java GetAllDirs.java:16: cannot find symbol symbol : variable checkFile location: class GetAllDirs System.out.println(checkFile.getName()); ^ 1 error $ cat GetAllDirs.java import java.util.*; import java.io.*; public class GetAllDirs { public void getAllDirs(File file) { if(file.isDirectory()){ System.out.println(file.getName()); File checkFile = new File(file.getCanonicalPath()); }else if(file.isFile()){ System.out.println(file.getName()); File checkFile = new File(file.getParent()); }else{ // checkFile should get Initialized at least HERE! File checkFile = file; } System.out.println(file.getName()); // WHY ERROR HERE: checkfile not found System.out.println(checkFile.getName()); } public static void main(String[] args) { GetAllDirs dirs = new GetAllDirs(); File current = new File("."); dirs.getAllDirs(current); } }
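
    For reference, a compilable sketch of the same program with checkFile declared once, before the branches, so it stays in scope for the final println (getCanonicalPath() also forces the method to declare IOException, which the original would trip over next):

        import java.io.File;
        import java.io.IOException;

        public class GetAllDirs {
            public void getAllDirs(File file) throws IOException {
                System.out.println(file.getName());
                File checkFile;                      // one declaration, visible after the branches
                if (file.isDirectory()) {
                    checkFile = new File(file.getCanonicalPath());
                } else if (file.isFile()) {
                    checkFile = new File(file.getParent());
                } else {
                    checkFile = file;
                }
                System.out.println(checkFile.getName());
            }

            public static void main(String[] args) throws IOException {
                new GetAllDirs().getAllDirs(new File("."));
            }
        }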


  • Is wordpress a good platform for webapp development?

    - by darlinton
    I am planning a webapp to control bank payment orders. In a quick overview, the user goes online and creates a payment order. This order goes to other people who pay it and register the payment in the system. The system keeps track of all the payments, keeping the account balance up to date. The system needs a login system, bank integration, and, at some point, support for thousands of clients. We can find articles on the web about the benefits of using the WordPress platform to build webapps. However, I could not find discussions with counterarguments to using WordPress. As the platform is the most important choice in a webapp project, I would like to know more about the pitfalls and harms of choosing WordPress. The question is: what are the benefits and harms of choosing WordPress as a development platform for a webapp that needs to be integrated with other systems (backend systems) and to handle thousands of users (does it scale up)?


  • bash: recursive listing of all files problem

    - by Michael Mao
    Run a recursive listing of all the files in /var/log and redirect standard output to a file called lsout.txt in your home directory. Complete this question WITHOUT leaving your home directory. An: ls -R /var/log/ /home/bqiu/lsout.txt I reckon the above bash command is not correct. This is what I've got so far: $ ls -1R .: cal.sh cokemachine.sh dir sort test.sh ./dir: afile.txt file subdir ./dir/subdir: $ ls -R | sed s/^.*://g cal.sh cokemachine.sh dir sort test.sh afile.txt file subdir But this still leaves all directory/sub-directory names (dir and subdir), plus a couple of empty lines. How could I get the correct result without using Perl or awk? Preferably using only basic bash commands (this is just because Perl and awk are outside the assessment scope).
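
    Two hedged readings of the assignment: if a plain recursive listing is acceptable, the original command is only missing the redirect; if only file names are wanted (no directory headers or blank lines), find does that directly.

        # Recursive listing, redirected (run from the home directory).
        ls -R /var/log > lsout.txt

        # File names only, one per line, still written to the home directory.
        find /var/log -type f > ~/lsout.txt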


  • Java: how do isDirectory and isFile work in the File class?

    - by HH
    $ javac GetAllDirs.java GetAllDirs.java:16: cannot find symbol symbol : variable checkFile location: class GetAllDirs System.out.println(checkFile.getName()); ^ 1 error $ cat GetAllDirs.java import java.util.*; import java.io.*; public class GetAllDirs { public void getAllDirs(File file) { if(file.isDirectory()){ System.out.println(file.getName()); File checkFile = new File(file.getCanonicalPath()); }else if(file.isFile()){ System.out.println(file.getName()); File checkFile = new File(file.getParent()); }else{ // checkFile should get Initialized at least HERE! File checkFile = file; } System.out.println(file.getName()); // WHY ERROR HERE: checkfile not found System.out.println(checkFile.getName()); } public static void main(String[] args) { GetAllDirs dirs = new GetAllDirs(); File current = new File("."); dirs.getAllDirs(current); } }


  • Oracle Data Pump import to a sql file error: ORA-31655 no data or metadata objects

    - by Francisco Quiñones
    Hello, I'm using Data Pump to export/import data; one requirement is to import the data to a sql file. The OS is Windows. I made the following export: expdp system/password directory=dpump_dir dumpfile=tablesdump.dmp content=DATA_ONLY tables=user.tablename and it works; I can see the file TABLESDUMP.DMP in the directory path. Then when I tried to import it to a sql file: impdp system/password directory=dpump_dir dumpfile=tablesdump.dmp sqlfile=tables_export.sql the log shows: ..... ORA-31655 no data or metadata objects selected for job ..... and the sql file is created empty in the directory path. I'm not a DBA, I'm a Java developer. Can you help me? Thanks
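
    A hedged explanation: SQLFILE only ever writes DDL extracted from the dump's metadata, and a dump taken with CONTENT=DATA_ONLY carries no metadata, which would account for ORA-31655 and the empty file. Re-exporting with metadata included gives impdp something to write, roughly:

        REM Export with the default CONTENT=ALL (or CONTENT=METADATA_ONLY) this time.
        expdp system/password directory=dpump_dir dumpfile=tablesdump.dmp tables=user.tablename

        REM Now the SQLFILE run has DDL to extract.
        impdp system/password directory=dpump_dir dumpfile=tablesdump.dmp sqlfile=tables_export.sql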


  • Using rvm with a standalone ruby script

    - by John Yeates
    I have rvm installed on a Mac OS X 10.6 system with the system ruby and 1.9.1. I also have this basic ruby script: #!/usr/bin/ruby require 'curb-fu' I need the script to use the system ruby regardless of what rvm's using at any given time; I'm assuming that I've got that right, at least. I've switched to the system ruby (rvm use system) and then installed the gem (gem install curb-fu). If I run irb and type require 'curb-fu', it works. However, running that script with ./myscript.rb fails: /Users/me/bin/podcast_notify.rb:6:in `require': no such file to load -- curb-fu (LoadError) from /Users/me/bin/podcast_notify.rb:6 What's going wrong here? How do I install curb-fu so that it's always available to this script?
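
    A hedged first check: confirm that the interpreter the shebang launches is the same one the gem was installed into; with rvm in play it is easy for gem install and /usr/bin/ruby to end up referring to different rubies, which would match irb working while the script fails.

        # Which ruby/gem/irb does the current shell resolve to?
        which ruby gem irb

        # Which interpreter does the script's shebang actually launch?
        head -1 ~/bin/podcast_notify.rb

        # Is curb-fu visible to the system ruby specifically?
        /usr/bin/gem list curb-fu
        /usr/bin/ruby -rrubygems -e 'require "curb-fu"; puts "ok"'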


  • How to improve my LDAP schema?

    - by asmaier
    Hello, I have an OpenLDAP database that holds some project objects which look like dn: cn=Proj1,ou=Project,ou=ua,dc=org cn: Proj1 objectClass: top objectClass: posixGroup member: 001ag member: 002ag System: ABEL System: PCx Budget: ABEL:1000000:0.3 Budget: PCx:300000:0.3 One can see that the Budget attribute is a ":"-separated string, where the first part holds the name of the system the budget is for, the second part holds some budget (which may change every month) and the last entry is a conversion factor for the budget of that system. Seeing this, I thought it was bad database design, since attribute values should always be atomic. But how can I improve this in LDAP, so that I can do a direct ldapsearch or ldapmodify of the budget of system "ABEL" instead of writing a script that has to parse and split the ":"-separated string?
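
    One hedged way to make the values atomic: model each budget as a child entry of the project instead of a packed string, so a single ldapsearch or ldapmodify can address the ABEL budget directly. The objectClass and attribute names below are invented and would need a small local schema:

        # Hypothetical LDIF; x-projectBudget, x-budgetAmount and x-conversionFactor
        # are placeholder names from an assumed local schema.
        dn: cn=ABEL,cn=Proj1,ou=Project,ou=ua,dc=org
        objectClass: top
        objectClass: x-projectBudget
        cn: ABEL
        x-budgetAmount: 1000000
        x-conversionFactor: 0.3

    Updating the monthly figure is then a single ldapmodify replacing x-budgetAmount on that one entry.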


  • DRY and SRP

    - by Timothy Klenke
    Originally posted on: http://geekswithblogs.net/TimothyK/archive/2014/06/11/dry-and-srp.aspx

    Kent Beck's XP Simplicity Rules (aka Four Rules of Simple Design) are a prioritized list of rules that, when applied to your code, generally yield a great design. As you'll see from the above link, the list has slightly evolved over time. I find today they are usually listed as:

    1. All Tests Pass
    2. Don't Repeat Yourself (DRY)
    3. Express Intent
    4. Minimalistic

    These are prioritized. If your code doesn't work (rule 1) then everything else is forfeit. Go back to rule one and get the code working before worrying about anything else.

    Over the years the community has debated whether the priority of rules 2 and 3 should be reversed. Some say a little duplication in the code is OK as long as it helps express intent. I've debated it myself. This recent post got me thinking about this again, hence this post.

    I don't think it is fair to compare "Expressing Intent" against "DRY". This is a comparison of apples to oranges. "Expressing Intent" is a principle of code quality. "Repeating Yourself" is a code smell. A code smell is merely an indicator that there might be something wrong with the code. It takes further investigation to determine if a violation of an underlying principle of code quality has actually occurred.

    For example, "using nouns for method names", "using verbs for property names", or "using Booleans for parameters" are all code smells that indicate that code probably isn't doing a good job at expressing intent. They are usually very good indicators. But what principle is the code smell of Duplication pointing to, and how good of an indicator is it?

    Duplication in the code base is bad for a couple of reasons. If you need to make a change and that change needs to be made in a number of locations, it is difficult to know if you have caught all of them. This can lead to bugs if/when one of those locations is overlooked. By refactoring the code to remove all duplication you will be left with only one place to change, thereby eliminating this problem.

    With most projects the code becomes the single source of truth for a project. If a production code base is inconsistent with a five-year-old requirements or design document, the production code that people are currently living with is usually declared as the current reality (or truth). Requirement or design documents at this age in a project life cycle are usually of little value.

    Although comparing production code to external documentation is usually straightforward, duplication within the code base muddles this declaration of truth. When code is duplicated, small discrepancies will creep in between the two copies over time. The question then becomes which copy is correct? As different factions debate how the software should work, trust in the software and the team behind it erodes.

    The code smell of Duplication points to a violation of the "Single Source of Truth" principle. Let me define that as: A stakeholder's requirement for a software change should never cause more than one class to change.

    Violation of the Single Source of Truth principle will always result in duplication in the code. However, the inverse is not always true. Duplication in the code does not necessarily indicate that there is a violation of the Single Source of Truth principle.

    To illustrate this, let's look at a retail system where the system will (1) send a transaction to a bank and (2) print a receipt for the customer.

    Although these are two separate features of the system, they are closely related. The reason for printing the receipt is usually to provide an audit trail back to the bank transaction. Both features use the same data: amount charged, account number, transaction date, customer name, retail store name, etcetera. Because both features use much of the same data, there is likely to be a lot of duplication between them. This duplication can be removed by making both features use the same data access layer.

    Then start coming the divergent requirements. The receipt stakeholder wants a change so that the account number has the last few digits masked out to protect the customer's privacy. That can be solved with a small IF statement whilst still eliminating all duplication in the system. Then the bank wants to take a picture of the customer as well as capture their signature and/or PIN number for enhanced security. Then the receipt owner wants to pull data from a completely different system to report the customer's loyalty program point total.

    After a while you realize that the two stakeholders have somewhat similar, but ultimately different responsibilities. They have their own reasons for pulling the data access layer in different directions. Then it dawns on you, the Single Responsibility Principle: There should never be more than one reason for a class to change.

    In this example we have two stakeholders giving two separate reasons for the data access class to change. It is a clear violation of the Single Responsibility Principle. That's a problem because it can often lead to the project owner pitting the two stakeholders against each other in a vain attempt to get them to work out a mutual single source of truth. But that doesn't exist. There are two completely valid truths that the developers need to support. How is this to be supported while honouring the Single Responsibility Principle? The solution is to duplicate the data access layer and let each stakeholder control their own copy.

    The Single Source of Truth and Single Responsibility Principles are very closely related. SST tells you when to remove duplication; SRP tells you when to introduce it. They may seem to be fighting each other, but really they are not. The key is to clearly identify the different responsibilities (or sources of truth) over a system. Sometimes there is a single person with that responsibility, other times there are many. This can be especially difficult if the same person has dual responsibilities. They might not even realize they are wearing multiple hats.

    In my opinion Single Source of Truth should be listed as the second rule of simple design, with Express Intent at number three. Investigation of the DRY code smell should yield to the proper application of SST, without violating SRP. When necessary, leave duplication in the system and let the class names express the different people that are responsible for controlling them. Knowing all the people with responsibilities over a system is the higher priority because you'll need to know this before you can express it. Although it may be a code smell when there is duplication in the code, it does not necessarily mean that the coder has chosen to be expressive over DRY or that the code is bad.


  • Users being forced to re-login randomly, before session and auth ticket timeout values are reached

    - by Don
    I'm having reports and complaints from my user that they will be using a screen and get kicked back to the login screen immediately on their next request. It doesn't happen all the time but randomly. After looking at the Web server the error that shows up in the application event log is: Event code: 4005 Event message: Forms authentication failed for the request. Reason: The ticket supplied has expired. Everything that I read starts out with people asking about web gardens or load balancing. We are not using either of those. We're a single Windows 2003 (32-bit OS, 64-bit hardware) Server with IIS6. This is the only website on this server too. This behavior does not generate any application exceptions or visible issues to the user. They just get booted back to the login screen and are forced to login. As you can imagine this is extremely annoying and counter-productive for our users. Here's what I have set in my web.config for the application in the root: <authentication mode="Forms"> <forms name=".TcaNet" protection="All" timeout="40" loginUrl="~/Login.aspx" defaultUrl="~/MyHome.aspx" path="/" slidingExpiration="true" requireSSL="false" /> </authentication> I have also read that if you have some locations setup that no longer exist or are bogus you could have issues. My path attributes are all valid directories so that shouldn't be the problem: <location path="js"> <system.web> <authorization> <allow users="*" /> </authorization> </system.web> </location> <location path="images"> <system.web> <authorization> <allow users="*" /> </authorization> </system.web> </location> <location path="anon"> <system.web> <authorization> <allow users="*" /> </authorization> </system.web> </location> <location path="App_Themes"> <system.web> <authorization> <allow users="*" /> </authorization> </system.web> </location> <location path="NonSSL"> <system.web> <authorization> <allow users="*" /> </authorization> </system.web> </location> The only thing I'm not clear on is if my timeout value in the forms property for the auth ticket has to be the same as my session timeout value (defined in the app's configuration in IIS). I've read some things that say you should have the authentication timeout shorter (40) than the session timeout (45) to avoid possible complications. Either way we have users that get kicked to the login screen a minute or two after their last action. So the session definitely should not be expiring. Update 2/23/09: I've since set the session timeout and authentication ticket timeout values to both be 45 and the problem still seems to be happening. The only other web.config in the application is in 1 virtual directory that hosts Community Server. That web.config's authentication settings are as follows: <authentication mode="Forms"> <forms name=".TcaNet" protection="All" timeout="40" loginUrl="~/Login.aspx" defaultUrl="~/MyHome.aspx" path="/" slidingExpiration="true" requireSSL="true" /> </authentication> And while I don't believe it applies unless you're in a web garden, I have both of the machine key values set in both web.config files to be the same (removed for convenience): <machineKey validationKey="<MYVALIDATIONKEYHERE>" decryptionKey="<MYDECRYPTIONKEYHERE>" validation="SHA1" /> <machineKey validationKey="<MYVALIDATIONKEYHERE>" decryptionKey="<MYDECRYPTIONKEYHERE>" validation="SHA1"/> Any help with this would be greatly appreciated. This seems to be one of those problems that yields a ton of Google results, none of which seem to be fitting into my situation so far.


  • How easily recognized are new TLDs?

    - by Ryan Muller
    I'm interested in purchasing a domain name for a new service I intend to market. I know that .com is instantly recognizable as a domain ending, and if I see stackoverflow.com I know it's a web address. However, I also recognize strings like github.io and mysite.tk as domains, since I've worked with domains like these. To the average member of the public, if one sees an address ending in .io or similar, non-mainstream TLD (e.g. on a billboard or business card) would they immediately know it's a URL and to type it into a browser? Or are these new domains only useful 1) for a technical audience or 2) when you will be primarily promoting your site through links and not print?


  • how to create a watcher using fsevents in mac osx 10.6

    - by mathan
    I'm trying to get file event notifications using the fsevents.h file. I'm working with Mac OS X 10.6 and Xcode 3.1.4, in which I found fsevents.h in the four following locations: /System/Library/Frameworks/CoreServices.framework/Versions/A/Frameworks/CarbonCore.framework/Versions/A/Headers/FSEvents.h /Xcode3.1.4/SDKs/MacOSX10.5.sdk/System/Library/Frameworks/CoreServices.framework/Versions/A/Frameworks/CarbonCore.framework/Versions/A/Headers /Developer/SDKs/MacOSX10.5.sdk/System/Library/Frameworks/CoreServices.framework/Versions/A/Frameworks/CarbonCore.framework/Versions/A/Headers /Developer/SDKs/MacOSX10.6.sdk/System/Library/Frameworks/CoreServices.framework/Versions/A/Frameworks/CarbonCore.framework/Versions/A/Headers I have the following issues in accessing fsevents.h: 1) Out of the above four locations, which one should be included, since fsevents is not getting included unless I put the following include syntax: include<../../../../Developer/SDKs/MacOSX10.6.sdk/System/Library/Frameworks/CoreServices.framework/Versions/A/Frameworks/CarbonCore.framework/Versions/A/Headers/fsevents.h 2) Where could I find the function definitions whose prototypes are declared in fsevents.h using the "extern" keyword?
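
    On the second point, a hedged note: the functions behind the kernel-level fsevents.h are not exported for applications; the supported interface is the FSEvents API in CoreServices (FSEvents.h, pulled in by CoreServices/CoreServices.h). A minimal sketch of a directory watcher with that API, compiled with -framework CoreServices (the watched path is a placeholder):

        #include <CoreServices/CoreServices.h>
        #include <stdio.h>

        static void callback(ConstFSEventStreamRef streamRef, void *info, size_t numEvents,
                             void *eventPaths, const FSEventStreamEventFlags flags[],
                             const FSEventStreamEventId ids[]) {
            char **paths = eventPaths;   /* plain C strings with the default create flags */
            size_t i;
            for (i = 0; i < numEvents; i++)
                printf("change under %s\n", paths[i]);
        }

        int main(void) {
            CFStringRef dir = CFSTR("/Users/Shared");   /* directory to watch; adjust */
            CFArrayRef paths = CFArrayCreate(NULL, (const void **)&dir, 1, NULL);
            FSEventStreamRef stream = FSEventStreamCreate(NULL, callback, NULL, paths,
                kFSEventStreamEventIdSinceNow, 1.0 /* latency, seconds */,
                kFSEventStreamCreateFlagNone);
            FSEventStreamScheduleWithRunLoop(stream, CFRunLoopGetCurrent(), kCFRunLoopDefaultMode);
            FSEventStreamStart(stream);
            CFRunLoopRun();              /* blocks and delivers events; Ctrl-C to quit */
            return 0;
        }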


  • Blank Screen at boot Ubuntu 12.04 - nvidia-current - Macbook Air 3,2

    - by soulnafein
    I've installed nvidia-current using the Additional Drivers application in Ubuntu 12.04. I need those drivers so I can use accelerated WebGL. After installing the drivers, and rebooting X fails to start and I have a frozen system/dark screen. Below is the content of Xorg.0.log How can I fix this problem? [ 4.666] X.Org X Server 1.11.3 Release Date: 2011-12-16 [ 4.666] X Protocol Version 11, Revision 0 [ 4.666] Build Operating System: Linux 2.6.42-23-generic x86_64 Ubuntu [ 4.666] Current Operating System: Linux david-macbook-air 3.2.0-34-generic #53-Ubuntu SMP Thu Nov 15 10:48:16 UTC 2012 x86_64 [ 4.666] Kernel command line: BOOT_IMAGE=/boot/vmlinuz-3.2.0-34-generic root=UUID=b3d5ae2a-72af-4ef9-b775-0d40b5f80f9b ro quiet splash vt.handoff=7 [ 4.666] Build Date: 29 August 2012 12:12:33AM [ 4.666] xorg-server 2:1.11.4-0ubuntu10.8 (For technical support please see http://www.ubuntu.com/support) [ 4.666] Current version of pixman: 0.24.4 [ 4.666] Before reporting problems, check http://wiki.x.org to make sure that you have the latest version. [ 4.666] Markers: (--) probed, (**) from config file, (==) default setting, (++) from command line, (!!) notice, (II) informational, (WW) warning, (EE) error, (NI) not implemented, (??) unknown. [ 4.666] (==) Log file: "/var/log/Xorg.0.log", Time: Thu Dec 13 10:18:02 2012 [ 4.668] (==) Using system config directory "/usr/share/X11/xorg.conf.d" [ 4.668] (==) No Layout section. Using the first Screen section. [ 4.668] (==) No screen section available. Using defaults. [ 4.668] (**) |-->Screen "Default Screen Section" (0) [ 4.668] (**) | |-->Monitor "<default monitor>" [ 4.668] (==) No monitor specified for screen "Default Screen Section". Using a default monitor configuration. [ 4.668] (==) Automatically adding devices [ 4.668] (==) Automatically enabling devices [ 4.668] (WW) The directory "/usr/share/fonts/X11/cyrillic" does not exist. [ 4.668] Entry deleted from font path. [ 4.668] (WW) The directory "/usr/share/fonts/X11/100dpi/" does not exist. [ 4.668] Entry deleted from font path. [ 4.669] (WW) The directory "/usr/share/fonts/X11/75dpi/" does not exist. [ 4.669] Entry deleted from font path. [ 4.669] (WW) The directory "/usr/share/fonts/X11/100dpi" does not exist. [ 4.669] Entry deleted from font path. [ 4.669] (WW) The directory "/usr/share/fonts/X11/75dpi" does not exist. [ 4.669] Entry deleted from font path. [ 4.669] (WW) The directory "/var/lib/defoma/x-ttcidfont-conf.d/dirs/TrueType" does not exist. [ 4.669] Entry deleted from font path. [ 4.669] (==) FontPath set to: /usr/share/fonts/X11/misc, /usr/share/fonts/X11/Type1, built-ins [ 4.669] (==) ModulePath set to "/usr/lib/x86_64-linux-gnu/xorg/extra-modules,/usr/lib/xorg/extra-modules,/usr/lib/xorg/modules" [ 4.669] (II) The server relies on udev to provide the list of input devices. If no devices become available, reconfigure udev or disable AutoAddDevices. 
[ 4.669] (II) Loader magic: 0x7f6222467b00 [ 4.669] (II) Module ABI versions: [ 4.669] X.Org ANSI C Emulation: 0.4 [ 4.669] X.Org Video Driver: 11.0 [ 4.669] X.Org XInput driver : 16.0 [ 4.669] X.Org Server Extension : 6.0 [ 4.670] (--) PCI:*(0:2:0:0) 10de:08a3:106b:00d3 rev 162, Mem @ 0x92000000/16777216, 0x80000000/268435456, 0x90000000/33554432, I/O @ 0x00001000/128, BIOS @ 0x????????/131072 [ 4.670] (II) Open ACPI successful (/var/run/acpid.socket) [ 4.670] (II) LoadModule: "extmod" [ 4.671] (II) Loading /usr/lib/xorg/modules/extensions/libextmod.so [ 4.671] (II) Module extmod: vendor="X.Org Foundation" [ 4.671] compiled for 1.11.3, module version = 1.0.0 [ 4.671] Module class: X.Org Server Extension [ 4.671] ABI class: X.Org Server Extension, version 6.0 [ 4.671] (II) Loading extension MIT-SCREEN-SAVER [ 4.671] (II) Loading extension XFree86-VidModeExtension [ 4.671] (II) Loading extension XFree86-DGA [ 4.671] (II) Loading extension DPMS [ 4.671] (II) Loading extension XVideo [ 4.671] (II) Loading extension XVideo-MotionCompensation [ 4.671] (II) Loading extension X-Resource [ 4.671] (II) LoadModule: "dbe" [ 4.671] (II) Loading /usr/lib/xorg/modules/extensions/libdbe.so [ 4.671] (II) Module dbe: vendor="X.Org Foundation" [ 4.671] compiled for 1.11.3, module version = 1.0.0 [ 4.671] Module class: X.Org Server Extension [ 4.671] ABI class: X.Org Server Extension, version 6.0 [ 4.671] (II) Loading extension DOUBLE-BUFFER [ 4.671] (II) LoadModule: "glx" [ 4.671] (II) Loading /usr/lib/x86_64-linux-gnu/xorg/extra-modules/libglx.so [ 4.869] (II) Module glx: vendor="NVIDIA Corporation" [ 4.869] compiled for 4.0.2, module version = 1.0.0 [ 4.869] Module class: X.Org Server Extension [ 4.869] (II) NVIDIA GLX Module 295.40 Thu Apr 5 21:57:38 PDT 2012 [ 4.869] (II) Loading extension GLX [ 4.869] (II) LoadModule: "record" [ 4.870] (II) Loading /usr/lib/xorg/modules/extensions/librecord.so [ 4.870] (II) Module record: vendor="X.Org Foundation" [ 4.870] compiled for 1.11.3, module version = 1.13.0 [ 4.870] Module class: X.Org Server Extension [ 4.870] ABI class: X.Org Server Extension, version 6.0 [ 4.870] (II) Loading extension RECORD [ 4.870] (II) LoadModule: "dri" [ 4.870] (II) Loading /usr/lib/xorg/modules/extensions/libdri.so [ 4.870] (II) Module dri: vendor="X.Org Foundation" [ 4.870] compiled for 1.11.3, module version = 1.0.0 [ 4.870] ABI class: X.Org Server Extension, version 6.0 [ 4.870] (II) Loading extension XFree86-DRI [ 4.870] (II) LoadModule: "dri2" [ 4.871] (II) Loading /usr/lib/xorg/modules/extensions/libdri2.so [ 4.871] (II) Module dri2: vendor="X.Org Foundation" [ 4.871] compiled for 1.11.3, module version = 1.2.0 [ 4.871] ABI class: X.Org Server Extension, version 6.0 [ 4.871] (II) Loading extension DRI2 [ 4.871] (==) Matched nvidia as autoconfigured driver 0 [ 4.871] (==) Matched nouveau as autoconfigured driver 1 [ 4.871] (==) Matched nv as autoconfigured driver 2 [ 4.871] (==) Matched vesa as autoconfigured driver 3 [ 4.871] (==) Matched fbdev as autoconfigured driver 4 [ 4.871] (==) Assigned the driver to the xf86ConfigLayout [ 4.871] (II) LoadModule: "nvidia" [ 4.871] (II) Loading /usr/lib/x86_64-linux-gnu/xorg/extra-modules/nvidia_drv.so [ 4.887] (II) Module nvidia: vendor="NVIDIA Corporation" [ 4.887] compiled for 4.0.2, module version = 1.0.0 [ 4.887] Module class: X.Org Video Driver [ 4.892] (II) LoadModule: "nouveau" [ 4.894] (II) Loading /usr/lib/xorg/modules/drivers/nouveau_drv.so [ 4.894] (II) Module nouveau: vendor="X.Org Foundation" [ 4.894] compiled for 1.11.3, 
module version = 1.0.2 [ 4.894] Module class: X.Org Video Driver [ 4.894] ABI class: X.Org Video Driver, version 11.0 [ 4.894] (II) LoadModule: "nv" [ 4.895] (WW) Warning, couldn't open module nv [ 4.895] (II) UnloadModule: "nv" [ 4.895] (II) Unloading nv [ 4.895] (EE) Failed to load module "nv" (module does not exist, 0) [ 4.895] (II) LoadModule: "vesa" [ 4.895] (II) Loading /usr/lib/xorg/modules/drivers/vesa_drv.so [ 4.896] (II) Module vesa: vendor="X.Org Foundation" [ 4.896] compiled for 1.11.3, module version = 2.3.0 [ 4.896] Module class: X.Org Video Driver [ 4.896] ABI class: X.Org Video Driver, version 11.0 [ 4.896] (II) LoadModule: "fbdev" [ 4.896] (II) Loading /usr/lib/xorg/modules/drivers/fbdev_drv.so [ 4.896] (II) Module fbdev: vendor="X.Org Foundation" [ 4.896] compiled for 1.11.3, module version = 0.4.2 [ 4.896] ABI class: X.Org Video Driver, version 11.0 [ 4.896] (==) Matched nvidia as autoconfigured driver 0 [ 4.896] (==) Matched nouveau as autoconfigured driver 1 [ 4.896] (==) Matched nv as autoconfigured driver 2 [ 4.896] (==) Matched vesa as autoconfigured driver 3 [ 4.896] (==) Matched fbdev as autoconfigured driver 4 [ 4.896] (==) Assigned the driver to the xf86ConfigLayout [ 4.896] (II) LoadModule: "nvidia" [ 4.896] (II) Loading /usr/lib/x86_64-linux-gnu/xorg/extra-modules/nvidia_drv.so [ 4.896] (II) Module nvidia: vendor="NVIDIA Corporation" [ 4.896] compiled for 4.0.2, module version = 1.0.0 [ 4.896] Module class: X.Org Video Driver [ 4.896] (II) UnloadModule: "nvidia" [ 4.896] (II) Unloading nvidia [ 4.896] (II) Failed to load module "nvidia" (already loaded, 32610) [ 4.896] (II) LoadModule: "nouveau" [ 4.897] (II) Loading /usr/lib/xorg/modules/drivers/nouveau_drv.so [ 4.897] (II) Module nouveau: vendor="X.Org Foundation" [ 4.897] compiled for 1.11.3, module version = 1.0.2 [ 4.897] Module class: X.Org Video Driver [ 4.897] ABI class: X.Org Video Driver, version 11.0 [ 4.897] (II) UnloadModule: "nouveau" [ 4.897] (II) Unloading nouveau [ 4.897] (II) Failed to load module "nouveau" (already loaded, 32610) [ 4.897] (II) LoadModule: "nv" [ 4.897] (WW) Warning, couldn't open module nv [ 4.897] (II) UnloadModule: "nv" [ 4.897] (II) Unloading nv [ 4.897] (EE) Failed to load module "nv" (module does not exist, 0) [ 4.897] (II) LoadModule: "vesa" [ 4.898] (II) Loading /usr/lib/xorg/modules/drivers/vesa_drv.so [ 4.898] (II) Module vesa: vendor="X.Org Foundation" [ 4.898] compiled for 1.11.3, module version = 2.3.0 [ 4.898] Module class: X.Org Video Driver [ 4.898] ABI class: X.Org Video Driver, version 11.0 [ 4.898] (II) UnloadModule: "vesa" [ 4.898] (II) Unloading vesa [ 4.898] (II) Failed to load module "vesa" (already loaded, 0) [ 4.898] (II) LoadModule: "fbdev" [ 4.898] (II) Loading /usr/lib/xorg/modules/drivers/fbdev_drv.so [ 4.898] (II) Module fbdev: vendor="X.Org Foundation" [ 4.898] compiled for 1.11.3, module version = 0.4.2 [ 4.898] ABI class: X.Org Video Driver, version 11.0 [ 4.898] (II) UnloadModule: "fbdev" [ 4.898] (II) Unloading fbdev [ 4.899] (II) Failed to load module "fbdev" (already loaded, 0) [ 4.899] (II) NVIDIA dlloader X Driver 295.40 Thu Apr 5 21:38:35 PDT 2012 [ 4.899] (II) NVIDIA Unified Driver for all Supported NVIDIA GPUs [ 4.899] (II) NOUVEAU driver Date: Wed Sep 12 13:42:43 2012 +0200 [ 4.899] (II) NOUVEAU driver for NVIDIA chipset families : [ 4.899] RIVA TNT (NV04) [ 4.899] RIVA TNT2 (NV05) [ 4.899] GeForce 256 (NV10) [ 4.899] GeForce 2 (NV11, NV15) [ 4.899] GeForce 4MX (NV17, NV18) [ 4.899] GeForce 3 (NV20) [ 4.900] GeForce 4Ti (NV25, NV28) 
[ 4.900] GeForce FX (NV3x) [ 4.900] GeForce 6 (NV4x) [ 4.900] GeForce 7 (G7x) [ 4.900] GeForce 8 (G8x) [ 4.900] GeForce GTX 200 (NVA0) [ 4.900] GeForce GTX 400 (NVC0) [ 4.900] (II) VESA: driver for VESA chipsets: vesa [ 4.900] (II) FBDEV: driver for framebuffer: fbdev [ 4.900] (++) using VT number 7 [ 4.902] (II) Loading sub module "fb" [ 4.902] (II) LoadModule: "fb" [ 4.902] (II) Loading /usr/lib/xorg/modules/libfb.so [ 4.902] (II) Module fb: vendor="X.Org Foundation" [ 4.902] compiled for 1.11.3, module version = 1.0.0 [ 4.902] ABI class: X.Org ANSI C Emulation, version 0.4 [ 4.902] (II) Loading sub module "wfb" [ 4.902] (II) LoadModule: "wfb" [ 4.903] (II) Loading /usr/lib/xorg/modules/libwfb.so [ 4.905] (II) Module wfb: vendor="X.Org Foundation" [ 4.905] compiled for 1.11.3, module version = 1.0.0 [ 4.905] ABI class: X.Org ANSI C Emulation, version 0.4 [ 4.905] (II) Loading sub module "ramdac" [ 4.905] (II) LoadModule: "ramdac" [ 4.905] (II) Module "ramdac" already built-in [ 4.907] (II) Loading /usr/lib/x86_64-linux-gnu/xorg/extra-modules/nvidia_drv.so [ 4.907] (II) Loading /usr/lib/xorg/modules/libwfb.so [ 4.907] (II) Loading /usr/lib/xorg/modules/libfb.so [ 4.912] (WW) Falling back to old probe method for vesa [ 4.912] (WW) Falling back to old probe method for fbdev [ 4.912] (II) Loading sub module "fbdevhw" [ 4.912] (II) LoadModule: "fbdevhw" [ 4.912] (II) Loading /usr/lib/xorg/modules/libfbdevhw.so [ 4.912] (II) Module fbdevhw: vendor="X.Org Foundation" [ 4.912] compiled for 1.11.3, module version = 0.0.2 [ 4.912] ABI class: X.Org Video Driver, version 11.0 [ 4.912] (II) NVIDIA(0): Creating default Display subsection in Screen section "Default Screen Section" for depth/fbbpp 24/32 [ 4.912] (==) NVIDIA(0): Depth 24, (==) framebuffer bpp 32 [ 4.912] (==) NVIDIA(0): RGB weight 888 [ 4.912] (==) NVIDIA(0): Default visual is TrueColor [ 4.912] (==) NVIDIA(0): Using gamma correction (1.0, 1.0, 1.0) [ 4.912] (**) NVIDIA(0): Enabling 2D acceleration [ 5.442] (EE) NVIDIA(0): Failed to initialize the display subsystem for the NVIDIA [ 5.442] (EE) NVIDIA(0): graphics device! [ 5.442] (EE) NVIDIA(0): Failed to get supported display device(s) [ 5.442] (EE) NVIDIA(0): Failed to initialize dac HAL [ 5.442] (II) UnloadModule: "nvidia" [ 5.442] (II) Unloading nvidia [ 5.442] (II) UnloadModule: "wfb" [ 5.442] (II) Unloading wfb [ 5.442] (II) UnloadModule: "fb" [ 5.443] (II) Unloading fb [ 5.443] (EE) Screen(s) found, but none have a usable configuration. [ 5.443] Fatal server error: [ 5.443] no screens found [ 5.443] Please consult the The X.Org Foundation support at http://wiki.x.org for help. [ 5.443] Please also check the log file at "/var/log/Xorg.0.log" for additional information. [ 5.443] [ 5.447] ddxSigGiveUp: Closing log [ 5.447] Server terminated with error (1). Closing log file.


  • Some problem running NUnit

    - by prosseek
    I have NUnit installed in this directory: C:\Program Files\NUnit 2.5.5\bin\net-2.0 When I try to run my unit test (mut.dll) from some random directory, I get the following error. I have to copy mut.dll under the NUnit directory in order to run it. ProcessModel: Default DomainUsage: Single Execution Runtime: net-2.0 Could not load file or assembly 'nunit.framework, Version=2.5.5.10112, Culture=neutral, PublicKeyToken=96d09a1eb7f44a77' or one of its dependencies. The system cannot find the file specified. What's wrong? Is there anything that I have to configure to run NUnit under any directory?
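
    A hedged interpretation: the runner resolves nunit.framework next to the test assembly, so mut.dll's output directory needs its own copy of nunit.framework.dll. In a Visual Studio project that usually means Copy Local on the reference, roughly as below (the HintPath is an assumption about the 2.5.5 install layout):

        <Reference Include="nunit.framework, Version=2.5.5.10112, Culture=neutral, PublicKeyToken=96d09a1eb7f44a77">
          <HintPath>C:\Program Files\NUnit 2.5.5\bin\net-2.0\framework\nunit.framework.dll</HintPath>
          <Private>True</Private>
        </Reference>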

