Search Results

Search found 25974 results on 1039 pages for 'source routing'.

  • SQLiteDataAdapter Update method returning 0 C#

    - by Lirik
    I loaded 83 rows from my CSV file, but when I try to update the SQLite database I get 0 rows... I can't figure out what I'm doing wrong. The program outputs:

        Num rows loaded is 83
        Num rows updated is 0

    The source code is:

        public void InsertData(String csvFileName, String tableName)
        {
            String dir = Path.GetDirectoryName(csvFileName);
            String name = Path.GetFileName(csvFileName);

            using (OleDbConnection conn = new OleDbConnection("Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" + dir + @";Extended Properties=""Text;HDR=Yes;FMT=Delimited\"""))
            {
                conn.Open();
                using (OleDbDataAdapter adapter = new OleDbDataAdapter("SELECT * FROM " + name, conn))
                {
                    QuoteDataSet ds = new QuoteDataSet();
                    adapter.Fill(ds, tableName);
                    Console.WriteLine("Num rows loaded is " + ds.Tags.Rows.Count);
                    InsertData(ds, tableName);
                }
            }
        }

        public void InsertData(QuoteDataSet data, String tableName)
        {
            using (SQLiteConnection conn = new SQLiteConnection(_connectionString))
            {
                using (SQLiteDataAdapter sqliteAdapter = new SQLiteDataAdapter("SELECT * FROM " + tableName, conn))
                {
                    using (new SQLiteCommandBuilder(sqliteAdapter))
                    {
                        conn.Open();
                        Console.WriteLine("Num rows updated is " + sqliteAdapter.Update(data, tableName));
                    }
                }
            }
        }

    Any hints on why it's not updating the correct number of rows?
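
    A likely cause, though not confirmed in the post: DataAdapter.Fill calls AcceptChanges on the rows it loads by default, so every row arrives with RowState Unchanged and sqliteAdapter.Update finds nothing to push. A minimal sketch of the usual workaround, reusing the names from the snippet above:

        // Sketch only: keep freshly filled rows in the Added state so a later
        // DataAdapter.Update treats them as pending INSERTs.
        using (OleDbDataAdapter adapter = new OleDbDataAdapter("SELECT * FROM " + name, conn))
        {
            adapter.AcceptChangesDuringFill = false;   // rows keep RowState.Added after Fill
            QuoteDataSet ds = new QuoteDataSet();
            adapter.Fill(ds, tableName);
            InsertData(ds, tableName);                 // Update() should now see the 83 changed rows
        }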

  • Trying to use authlogic-connect as a plugin in place of a gem - server doesn't start

    - by Arkid
    I am trying to use authlogic-connect as a plugin in Rails 3 in place of a gem. I have made an entry in the Gemfile as:

        gem "authlogic-connect", :require => "authlogic-connect", :path => "localgems"

    Now when I run bundle install, it runs fine. When I try to start the server I get the error:

        Could not find gem 'authlogic-connect (>= 0, runtime)' in source at localgems.
        Source does not contain any versions of 'authlogic-connect (>= 0, runtime)'
        Try running `bundle install`.

    I have placed the unzipped gem, renamed to authlogic-connect, in the localgems folder. What is the problem? Here is what I get on using rails plugin install:

        arkidmitra$ rails plugin install git://github.com/viatropos/authlogic-connect.git
        Usage:
          rails new APP_PATH [options]

        Options:
              [--skip-gemfile]        # Don't create a Gemfile
          -d, [--database=DATABASE]   # Preconfigure for selected database (options: mysql/oracle/postgresql/sqlite3/frontbase/ibm_db)
                                      # Default: sqlite3
          -O, [--skip-active-record]  # Skip Active Record files
              [--dev]                 # Setup the application with Gemfile pointing to your Rails checkout
          -J, [--skip-prototype]      # Skip Prototype files
          -T, [--skip-test-unit]      # Skip Test::Unit files
          -G, [--skip-git]            # Skip Git ignores and keeps
          -r, [--ruby=PATH]           # Path to the Ruby binary of your choice
                                      # Default: /System/Library/Frameworks/Ruby.framework/Versions/1.8/usr/bin/ruby
          -m, [--template=TEMPLATE]   # Path to an application template (can be a filesystem path or URL)
          -b, [--builder=BUILDER]     # Path to an application builder (can be a filesystem path or URL)
              [--edge]                # Setup the application with Gemfile pointing to Rails repository

        Runtime options:
          -q, [--quiet]    # Supress status output
          -s, [--skip]     # Skip files that already exist
          -f, [--force]    # Overwrite files that already exist
          -p, [--pretend]  # Run but do not make any changes

        Rails options:
          -h, [--help]     # Show this help message and quit
          -v, [--version]  # Show Rails version number and quit

        Description:
            The 'rails new' command creates a new Rails application with a default
            directory structure and configuration at the path you specify.

        Example:
            rails new ~/Code/Ruby/weblog

            This generates a skeletal Rails installation in ~/Code/Ruby/weblog.
            See the README in the newly created application to get going.

  • Get column names from a table where one of the column names is a keyword

    - by syedsaleemss
    I'm using a C# .NET Windows Forms application. I have created a database which has many tables. In one of the tables I have entered data. This table has 4 columns named key, name, age, value. Here the name "key" of the first column is a keyword. Now I am trying to get these column names into a combo box. I am unable to get the name "key". It works for "key" when I use this code:

        private void comboseccolumn_SelectedIndexChanged(object sender, EventArgs e)
        {
            string dbname = combodatabase.SelectedItem.ToString();
            string path = @"Data Source=" + textBox1.Text + ";Initial Catalog=" + dbname + ";Integrated Security=SSPI";
            //string path=@"Data Source=SYED-PC\SQLEXPRESS;Initial Catalog=resources;Integrated Security=SSPI";
            SqlConnection con = new SqlConnection(path);
            string tablename = comboBox2.SelectedItem.ToString();
            //string query= "Select * from" +tablename+;
            //SqlDataAdapter adp = new SqlDataAdapter(" Select [Key] ,value from " + tablename, con);
            SqlDataAdapter adp = new SqlDataAdapter(" Select [" + combofirstcolumn.SelectedItem.ToString() + "]," + comboseccolumn.SelectedItem.ToString() + "\t from " + tablename, con);
            DataTable dt = new DataTable();
            adp.Fill(dt);
            dataGridView1.DataSource = dt;
        }

    This works because I am using "[" in the select query, but it won't work for non-keys. And if I remove the "[" it does not work for key. Please suggest how I can get both key and non-key column names.
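
    In T-SQL, square brackets are a general identifier-quoting mechanism rather than something reserved for keywords, so bracketing every column name should work for key and non-key columns alike. A hedged sketch, reusing the control names from the snippet above:

        // Sketch: bracket-quote every identifier; [name], [age] and [value] are as legal as [key].
        string col1 = combofirstcolumn.SelectedItem.ToString();
        string col2 = comboseccolumn.SelectedItem.ToString();
        SqlDataAdapter adp = new SqlDataAdapter(
            "SELECT [" + col1 + "], [" + col2 + "] FROM [" + tablename + "]", con);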

  • Apache Axis: How to set call properties using code generated from wsdl2java?

    - by marc esher
    I'm using Apache Axis 1.4 (yes, the old one), with wsdl2java to generate the client code for a web service. I'd like to set additional properties on the Call object before calling methods on the generated stub. For example, I'd like to set username and password, perhaps add or modify existing headers, and change the client handlers to use different implementations. Currently, I'm doing this by modifying the generated Stub class and calling the appropriate setters. However, I'd like to achieve this without touching the generated files. I'm confused, though, because the Stub class has createCall(), which creates the call object and sets some properties; currently, this is where I'm modifying the generated source code. Then the Stub contains:

        clientMethod1() {
            // blahblah
            Call _call = createCall();
            // ......
            _call.invoke();
        }

    So I can't see a way to use the service locator to get a stub, modify the properties I want to modify, and then use the stub to call the methods I want to call, given that the stub methods call createCall() and then invoke(). There doesn't appear to be a way to intercept the new Call object before it's invoked. So: how do you modify properties in the call without modifying the generated Stub class's source code? Thanks for info or even pointers to existing documentation.

  • Implementing commands in MVVM

    - by user312372
    I am trying to change the Source property of the frame in Page1.xaml when the SampleCommand is executed. How do I achieve this in the view model?

    Page1.xaml:

        <Page ...>
          <DockPanel>
            <r:Ribbon>
              <r:RibbonTab Label="Keys">
                <r:RibbonTab.Groups>
                  <r:RibbonGroup GroupSizeDefinitions="{StaticResource RibbonLayout}">
                    <r:RibbonGroup.Command>
                      <r:RibbonCommand LabelTitle="RibbonButton"/>
                    </r:RibbonGroup.Command>
                    <r:RibbonButton x:Name="RibbonButton1" Command="{Binding Path=SampleCommand}"/>
                  </r:RibbonGroup>
                </r:RibbonTab.Groups>
              </r:RibbonTab>
            </r:Ribbon>
            <Border Name="PageBorder" Grid.Row="0" Grid.Column="1">
              <Frame Name="pageFrame" Source="FirstPage.xaml" />
            </Border>
          </DockPanel>
        </Page>

    Page1ViewModel.cs:

        RelayCommand _sampleCommand;
        public ICommand SampleCommand
        {
            get
            {
                // create command ??
                return _sampleCommand;
            }
        }

    Page1.xaml.cs:

        Page1ViewModel pageViewModel;
        // When the page loads:
        this.DataContext = pageViewModel;
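
    One common way to fill in that getter is to create the command lazily and have it mutate bindable state on the view model instead of touching the view. The sketch below is an assumption-heavy illustration: it presumes the usual RelayCommand(Action<object>) constructor, an OnPropertyChanged helper from INotifyPropertyChanged on Page1ViewModel, and a new CurrentPageSource property that the Frame would bind to (Source="{Binding CurrentPageSource}") instead of hard-coding FirstPage.xaml.

        // Sketch only: names other than SampleCommand are assumptions, not taken from the original post.
        public ICommand SampleCommand
        {
            get
            {
                if (_sampleCommand == null)
                {
                    _sampleCommand = new RelayCommand(param =>
                    {
                        // Change view-model state; the Frame reacts through its binding.
                        CurrentPageSource = new Uri("SecondPage.xaml", UriKind.Relative);
                    });
                }
                return _sampleCommand;
            }
        }

        private Uri _currentPageSource = new Uri("FirstPage.xaml", UriKind.Relative);
        public Uri CurrentPageSource
        {
            get { return _currentPageSource; }
            set { _currentPageSource = value; OnPropertyChanged("CurrentPageSource"); }
        }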

  • Publish Maven artifacts on FTP with Hudson FTP Publisher Plugin

    - by jaguard
    I'm building a number of artefacts (zip files for different environments: test, dev) with the maven-assembly-plugin, using a specialized Maven profile. I want to copy/collect these artefacts on an FTP server, keeping the version (01.07.10.16.Wed-1626) as a folder, so I need to copy from test/build/01.07.10.16.Wed-1626/ to ftp://my-ftp-server:21/projects/myserver-1.7/01.07.10.16.Wed-1626/.

    The layout of the Maven output is this:

        target/
          build/
            01.07.10.16.Wed-1626/
              my-server-01.07.10.16.Wed-1626-dev.zip
              my-server-01.07.10.16.Wed-1626-test.zip

    For copying the artefacts I'm using the FTP Publisher Plugin, but it seems I am missing something: even though the build is OK and the artefacts are built without problem, the job finishes without copying the artefacts, and in the console there is no log info about copying them.

    My FTP publisher config (FTP repository hosts) is:

        Hostname: my-ftp-server
        Port: 21
        Timeout: 10000
        Root Repository Path: projects
        User Name: my-user
        Password: my-pass

    My Hudson job FTP publisher config (Publish artifacts to FTP) is:

        FTP site: my-ftp-server
        Files to upload
          Source: target/build/**
          Destination: myserver-1.7

    1. Is there any log to check whether there are any FTP copy errors?
    2. Is there any problem with the file pattern (source) or with the destination?

  • Chain of DataBinding

    - by Neir0
    Hello, I am trying to set up the following data-binding chain:

        Property -> DependencyProperty -> Property

    But I am having trouble. For example, we have a simple class with two properties implementing INotifyPropertyChanged:

        public class MyClass : INotifyPropertyChanged
        {
            private string _num1;
            public string Num1
            {
                get { return _num1; }
                set { _num1 = value; OnPropertyChanged("Num1"); }
            }

            private string _num2;
            public string Num2
            {
                get { return _num2; }
                set { _num2 = value; OnPropertyChanged("Num2"); }
            }

            public event PropertyChangedEventHandler PropertyChanged;

            public void OnPropertyChanged(string e)
            {
                PropertyChangedEventHandler handler = PropertyChanged;
                if (handler != null)
                    handler(this, new PropertyChangedEventArgs(e));
            }
        }

    And a TextBlock declared in XAML:

        <TextBlock Name="tb" FontSize="20" Foreground="Red" Text="qwerqwerwqer" />

    Now let's try to bind Num1 to tb.Text:

        private MyClass _myClass = new MyClass();

        public MainWindow()
        {
            InitializeComponent();
            Binding binding1 = new Binding("Num1") { Source = _myClass, Mode = BindingMode.OneWay };
            Binding binding2 = new Binding("Num2") { Source = _myClass, Mode = BindingMode.TwoWay };
            tb.SetBinding(TextBlock.TextProperty, binding1);
            //tb.SetBinding(TextBlock.TextProperty, binding2);

            var timer = new Timer(500) { Enabled = true, };
            timer.Elapsed += (sender, args) => _myClass.Num1 += "a";
            timer.Start();
        }

    It works well. But if we uncomment the line

        tb.SetBinding(TextBlock.TextProperty, binding2);

    then the TextBlock displays nothing. The data binding doesn't work! How can I do what I want?
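
    Worth knowing, although the post does not state it: a dependency property can carry only one binding expression at a time, so the second SetBinding call silently replaces binding1 with binding2, and since Num2 is still null the TextBlock goes blank. A hedged sketch of one way to keep all three values in step with a single binding, by forwarding inside MyClass:

        // Sketch: let Num1 drive both the TextBlock binding and Num2.
        public string Num1
        {
            get { return _num1; }
            set
            {
                _num1 = value;
                OnPropertyChanged("Num1");
                Num2 = value;   // keep the second property in step with the first
            }
        }

    A MultiBinding with an IMultiValueConverter is the other usual route when the two source properties must stay independent.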

  • Virtual LAN on the Cloud -- Help confirm my understanding?

    - by marfarma
    [Note: Tried to post this over at Server Fault, but I don't have enough 'points' for more than one link. Powers that be, move this question over there.]

    Please give this a quick read and let me know if I'm missing something before I start trying to make this work. I'm not a systems admin professional, and I'd hate to end up banging my head into the wall if I can avoid it.

    Goals:
      - Create a 'road-warrior' capable, star-shaped virtual LAN for consultants who spend the majority of their time on client sites, and whose firm has no physical network or servers.
      - Enable CIFS access to a cloud-server based installation of Alfresco.
      - Allow eventual implementation of some form of single-sign-on (OpenLDAP server) access to Alfresco and other server applications implemented in the future.

    Given:
      - All servers will live in the public internet cloud (Rackspace Cloud Servers).
      - The OpenVPN server will be a Linux distro, probably Ubuntu 9.x, installed on the same server as Alfresco (at least to start).
      - Staff will access server applications and resources from client sites, hotels, trains, planes, coffee shops or their homes over various ISPs, using their company laptops or personal home desktops.

    Based on my research thus far, to accomplish this I'll need:
      - OpenVPN with bridging enabled to create a star-shaped "virtual" LAN:
        http://openvpn.net/index.php/open-source/documentation/miscellaneous/76-ethernet-bridging.html
      - A road-warrior network configuration, as described in this Shorewall article (lower down the page):
        http://www.shorewall.net/OPENVPN.html
      - Configure bridge addressing (probably DHCP):
        http://openvpn.net/index.php/open-source/faq.html#bridge-addressing
      - Configure CIFS / Samba to accept the VPN IP address:
        http://serverfault.com/questions/137933/howto-access-samba-share-over-vpn-tunnel
      - Set up client software, with keys configured for access (potentially through an OpenVPN-AS client portal):
        http://www.openvpn.net/index.php/access-server/download-openvpn-as/221-installation-overview.html

  • configure.in: AM_DISABLE_SHARED doesn't change my Makefile

    - by t2k32316
    I'm extremely new to using Makefiles and autoconf. I'm using the Camellia image library and trying to statically link my code against their libraries. When I run "make" on the Camellia image library, I get libCamellia.a, .so, .la, and .so.0.0.0 files inside my /usr/local/lib directory. This is the command I use to compile my code with their libraries:

        gcc -L/usr/local/lib -lCamellia -o myprogram myprogram.c

    This works fine, but when I try to statically link, this is what I get:

        gcc -static -L/usr/local/lib -lCamellia -o myprogram myprogram.c
        /tmp/cck0pw70.o: In function `main':
        myprogram.c:(.text+0x23): undefined reference to `camLoadPGM'
        myprogram.c:(.text+0x55): undefined reference to `camAllocateImage'
        myprogram.c:(.text+0x97): undefined reference to `camZoom2x'
        myprogram.c:(.text+0x104): undefined reference to `camSavePGM'
        collect2: ld returned 1 exit status

    I want to statically link because I'm trying to modify the Camellia source code and I want to compare my version against theirs. So after some googling, I tried adding AM_DISABLE_SHARED to the configure.in file. But after running ./configure, I still get the exact same Makefile, and after "make install" I still get the same results as above.

    What is an easy way to get two versions of my code, one compiled against the original Camellia source and one against my modified version? I think static libraries should work. Is there an easy way to get static libraries working, or are there other simple solutions to my problem? I just don't want to re-"make" and re-"make install" every time I want to compare my version against the original.

  • Using Java PDFBox library to write Russian PDF

    - by Brad
    Hello, I am using a Java library called PDFBox, trying to write text to a PDF. It works perfectly for English text, but when I tried to write Russian text inside the PDF the letters came out very strange. It seems the problem is in the font used, but I am not so sure about that, so I hope someone can guide me through this. Here are the important code lines:

        // Windows Russian font imported to write the Russian text.
        PDTrueTypeFont font = PDTrueTypeFont.loadTTF( pdfFile, new File( "fonts/VREMACCI.TTF" ) );
        font.setEncoding( new WinAnsiEncoding() ); // Define the encoding used in writing.
        // Some code here to open the PDF & define a new page.
        contentStream.drawString( "??????? ????????????" ); // Write the Russian text.

    The WinAnsiEncoding source code is: Click here

    --------------------- Edit on 18 November 2009

    After some investigation, I am now sure it is an encoding problem. This could be solved by defining my own encoding using the helpful PDFBox class called DictionaryEncoding. I am not sure how to use it, but here is what I have tried until now:

        COSDictionary cosDic = new COSDictionary();
        cosDic.setString( COSName.getPDFName("Ercyrillic"), "0420 " ); // Russian letter.
        font.setEncoding( new DictionaryEncoding( cosDic ) );

    This does not work; it seems I am filling the dictionary in the wrong way, and when I write a PDF page using this it appears blank.

    The DictionaryEncoding source code is: Click here

    Thanks . . .

  • Using git submodules in a git-svn project

    - by Matthias
    In our git-svn managed project, we have 3 upstream projects that are all kept in native git repositories on GitHub. Since the source code of those upstream projects is under our control and changes frequently, our current solution, namely re-deploying the build artifacts to the super-project every time we change something, is quite cumbersome. What I'd like to have is this:

        parent project (git-svn):
          --> submodule 1 (git)
          --> submodule 2 (git)
          --> submodule 3 (git)

    That way, the source code for submodules 1-3 is compiled along with the sources for the super-project, but I can push changes to the submodules separately. The question is: what happens when I git svn dcommit on the parent project? Does this even work?

    UPDATE: Hm, I just set up a simple project structure trying to resemble this scenario, and I receive this error message when trying to dcommit on the superproject:

        a0301b11f3544a1e71067ff270eded65e4c8afbd doesn't exist in the repository at /opt/local/libexec/git-core/git-svn line 4775
        Failed to read object a0301b11f3544a1e71067ff270eded65e4c8afbd at /opt/local/libexec/git-core/git-svn line 574

    Any ideas/suggestions?

  • Subversion versus Vault

    - by WebDude
    I'm currently reviewing the benefits of moving from SVN to SourceGear Vault. Has anyone got advice, or a link to a detailed comparison between the two? Bear in mind I would have to move my current source control system across, which works strongly in SVN's favor. Here is some info I have found out thus far from my own investigations.

    I have been taking some time tests between the two, and Vault seems to perform most operations much faster. The time tests used the same server as the repository, the same workstation client, and the same project.

    Time comparisons (mm:ss):

        SVN
          Add/Commit              12:30
          Get Latest Revision      5:35
          Tagging/Labelling        0:01
          Branching                N/A - I don't think true branching exists in SVN

        Vault
          Add/Commit               4:45
          Get Latest Revision      0:51
          Tagging/Labelling        0:30
          Branching                3:23

    I also found an online source comparing some other points. This is the kind of information I'm looking for.

    Usage comparisons:
      - Subversion is edit/merge/commit only. Vault allows you to do either edit/merge/commit or checkout/edit/checkin.
      - Vault looks and acts just like VSS, which makes the learning curve effectively zero for VSS users.
      - Vault has a VS plugin, but it only works if you're going to run in checkout mode.
      - Subversion has clients for pretty much every OS you can imagine; Vault has a GUI client for Windows and a command line client for Mono.
      - Both will support remote work, since both use HTTP as their transport (Subversion uses extended DAV, Vault uses SOAP).
      - Subversion installation, especially with Apache, is more complex.
      - Subversion has a lot of third party support. Vault has just a few things.

    My question: has anyone got advice, or a link to a detailed comparison between the two?

  • Weblogic 10.3.0: Losing a stateless session bean in the bean pool

    - by KlasE
    Hi, we have a strange situation where we lose a stateless session bean in a bean pool in WebLogic 10.3.0. Since we only have one bean in the pool, this effectively hangs all incoming calls. We do not want more than one instance in the pool because of application restrictions. In the WebLogic admin console, we can see that there is 1 instance in the bean pool, 0 beans in use and 1 waiting incoming request. The question is: what can have caused the system to not send the request to the one obviously free bean instance? This happens after several hours and over 100000 incoming requests, and the same scenario worked fine in the old WebLogic 8 environment. We get the following stack trace:

        "[ACTIVE] ExecuteThread: '5' for queue: 'weblogic.kernel.Default (self-tuning)'"
        waiting for lock java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject@b0d484 TIMED_WAITING
          sun.misc.Unsafe.park(Native Method)
          java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:198)
          java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await(AbstractQueuedSynchronizer.java:2054)
          weblogic.ejb.container.pool.StatelessSessionPool.waitForBean(StatelessSessionPool.java:269)
          weblogic.ejb.container.pool.StatelessSessionPool.getBean(StatelessSessionPool.java:111)
          weblogic.ejb.container.manager.StatelessManager.preInvoke(StatelessManager.java:148)
          weblogic.ejb.container.internal.BaseRemoteObject.preInvoke(BaseRemoteObject.java:227)
          weblogic.ejb.container.internal.StatelessRemoteObject.preInvoke(StatelessRemoteObject.java:52)
          com.mycompany.beans.MessageLogFacace_n73y0z_EOImpl.isMyStuffValid(MessageLogFacace_n73y0z_EOImpl.java:261)
          com.mycompany.beans.MessageLogFacace_n73y0z_EOImpl_WLSkel.invoke(Unknown Source)
          weblogic.rmi.internal.BasicServerRef.invoke(BasicServerRef.java:589)
          weblogic.rmi.cluster.ClusterableServerRef.invoke(ClusterableServerRef.java:230)
          weblogic.rmi.internal.BasicServerRef$1.run(BasicServerRef.java:477)
          weblogic.security.acl.internal.AuthenticatedSubject.doAs(AuthenticatedSubject.java:363)
          weblogic.security.service.SecurityManager.runAs(Unknown Source)
          weblogic.rmi.internal.BasicServerRef.handleRequest(BasicServerRef.java:473)
          weblogic.rmi.internal.wls.WLSExecuteRequest.run(WLSExecuteRequest.java:118)
          weblogic.work.ExecuteThread.execute(ExecuteThread.java:201)
          weblogic.work.ExecuteThread.run(ExecuteThread.java:173)

    Any help would be very welcome.

    Regards,
    Klas

  • How to force ADO.NET to use only the System.String data type in the reader's TableSchema

    - by Keith Sirmons
    Howdy, I am using an OleDbConnection to query an Excel 2007 spreadsheet. I want to force the OleDbDataReader to use only string as the column data type. The system looks at the first 8 rows of data and infers the data type to be Double. The problem is that on row 9 I have a string in that column, and the OleDbDataReader returns a null value since it could not be cast to a Double. I have used these connection strings:

        Provider=Microsoft.ACE.OLEDB.12.0;Data Source="ExcelFile.xlsx";Persist Security Info=False;Extended Properties="Excel 12.0;IMEX=1;HDR=No"
        Provider=Microsoft.Jet.OLEDB.4.0;Data Source="ExcelFile.xlsx";Persist Security Info=False;Extended Properties="Excel 8.0;HDR=No;IMEX=1"

    Looking at reader.GetSchemaTable().Rows[7].ItemArray[5], its DataType is Double. Row 7 in this schema correlates with the specific column in Excel I am having issues with; ItemArray[5] is its DataType column.

    Is it possible to create a custom TableSchema for the reader so that, when accessing Excel files, I can treat all cells as text instead of letting the system attempt to infer the data type? I found some good info on this page: Tips for reading Excel spreadsheets using ADO.NET:

        The main quirk about the ADO.NET interface is how datatypes are handled. (You'll notice I've been
        carefully avoiding the question of which datatypes are returned when reading the spreadsheet.)
        Are you ready for this? ADO.NET scans the first 8 rows of data, and based on that guesses the
        datatype for each column. Then it attempts to coerce all data from that column to that datatype,
        returning NULL whenever the coercion fails!

    Thank you,
    Keith

    Here is a reduced version of my code:

        using (OleDbConnection connection = new OleDbConnection(BuildConnectionString(dataMapper).ToString()))
        {
            connection.Open();
            using (OleDbCommand cmd = new OleDbCommand())
            {
                cmd.Connection = connection;
                cmd.CommandText = "SELECT * FROM [Sheet1$]";
                using (OleDbDataReader reader = cmd.ExecuteReader())
                {
                    using (DataTable dataTable = new DataTable("TestTable"))
                    {
                        dataTable.Load(reader);
                        base.SourceDataSet.Tables.Add(dataTable);
                    }
                }
            }
        }
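
    Not asked in the post, but the behaviour described is normally traced to the Jet/ACE TypeGuessRows setting (8 rows by default) working together with IMEX=1: with IMEX=1 a column that the sample sees as mixed comes back as text, yet the sample itself is still only TypeGuessRows rows deep. A hedged sketch; the registry paths in the comment are the commonly cited ones and should be treated as assumptions to verify:

        // Sketch only: IMEX=1 plus a larger TypeGuessRows sample is the usual way to get
        // everything back as System.String instead of Double-with-NULL-holes.
        //
        // Registry (machine-wide, needs admin rights; verify the exact path for your provider):
        //   Jet 4.0: HKLM\SOFTWARE\Microsoft\Jet\4.0\Engines\Excel\TypeGuessRows = 0
        //   ACE 12 : HKLM\SOFTWARE\Microsoft\Office\12.0\Access Connectivity Engine\Engines\Excel\TypeGuessRows = 0
        // A value of 0 makes the driver scan up to 16,384 rows before deciding on a column type.
        string connStr =
            "Provider=Microsoft.ACE.OLEDB.12.0;" +
            "Data Source=ExcelFile.xlsx;" +
            "Persist Security Info=False;" +
            "Extended Properties=\"Excel 12.0;HDR=No;IMEX=1\"";

        using (OleDbConnection connection = new OleDbConnection(connStr))
        {
            connection.Open();
            // ... same OleDbCommand / DataTable.Load code as in the reduced sample above.
        }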

  • How to change identifier quote character in SSIS for connection to ODBC DSN

    - by William Rose
    I'm trying to create an SSIS 2008 Data Source View that reads from an Ingres database via the ODBC driver for Ingres. I've downloaded the Ingres 10 Community Edition to get the ODBC driver, installed it, and set up the data access server and a DSN on the server running SSIS.

    If I connect to the SQL Server 2008 Database Engine on the server running SSIS, I can retrieve data from Ingres over the ODBC DSN by running the following command:

        SELECT * FROM OPENROWSET(
            'MSDASQL'
          , 'DSN=IngresODBC;UID=testuser;PWD=testpass'
          , 'SELECT * FROM iitables')

    So I am quite sure that the ODBC setup is correct. If I try the same query with SQL Server style bracketed identifier quotes, I get an error, as Ingres doesn't support this syntax:

        SELECT * FROM OPENROWSET(
            'MSDASQL'
          , 'DSN=IngresODBC;UID=testuser;PWD=testpass'
          , 'SELECT * FROM [iitables]')

    The error is "[Ingres][Ingres 10.0 ODBC Driver][Ingres 10.0]line 1, Unexpected character '['.".

    What I am finding is that I get the same error when I try to add tables from Ingres to an SSIS Data Source View. The initial step of selecting the ODBC provider works fine, and I am shown a list of tables/views to add. I then select any table, try to add it to the view, and get "ERROR [5000A] [Ingres][Ingres 10.0 ODBC Driver][Ingres 10.0]line 3, Unexpected character '['.".

    Following Ed Harper's suggestion of creating a named query also seems to be stymied. If I put the following text into my named query:

        SELECT * FROM "iitables"

    I still get an error: "ERROR [5000A] [Ingres][Ingres 10.0 ODBC Driver][Ingres 10.0]line 2, Unexpected character '['". According to the error, the query text passed by SSIS to ODBC was:

        SELECT [iitables].*
        FROM (
            SELECT * FROM "iitables"
        ) AS [iitables]

    It seems that SSIS assumes that bracket quote characters are acceptable when they aren't. How can I persuade it not to use them? Double quotes are acceptable.

  • Dojo DnD: how to access newly copied node on onDndDrop event?

    - by toshinao
    Hi. I am working on code like the following:

        01: var c1 = new dojo.dnd.Source('container1', {copyOnly:true}); // container1 is a div
        02: var c2 = new dojo.dnd.Source('container2'); // container2 is a div
        03: var list = [];
        04: for (var i = 0; i < 3; i++) { list.push( dojo.create('div') ); }
        05: c1.insertNodes(false, list);
        06:
        07: function checkDndCopy(nodes, target){
        08:     dojo.forEach(nodes, function(node){ alert(node.id); } );
        09: }
        10: dojo.subscribe("/dnd/drop", function(){
        11:     var mgr = dojo.dnd.manager();
        12:     checkDndCopy(mgr.nodes, mgr.target);
        13: });

    The nodes inserted into c1 at line 05 have the ids "dojoUnique1, dojoUnique2, dojoUnique3". On a drag and drop of a node from c1 to c2, an onDndDrop event is fired and the subscribe callback defined in lines 10-13 is invoked. I expected the newly copied node to appear in the nodes (for example) at line 08, but this is not true. When dojoUnique1 is the target of the drag and drop, nodes at line 08 contains only dojoUnique1. I want to modify some attributes of the newly copied nodes on the onDndDrop event. Please let me know how such a thing can be done.

  • Deploying an ASP.NET MVC 2 / C# 4.0 application on IIS 6

    - by Mose
    Hi, I have a problem migrating from VS.NET 2008 / MVC 1 to VS.NET 2010 (+ C# 4.0) / MVC 2. The web.config has been updated, and the site runs well in Cassini, but my problem now is deploying on IIS 6. I updated the web site to run using ASP.NET 4, but whatever URL I try, I always get a 404 error. It's as if the routing was not taken into account (yes, the wildcard mapping has been done). I do not understand this mess and could not google anything interesting... Thanks for your suggestions! O.

  • Java Playing Cards Game Framework

    - by isme
    My friends and I at uni love playing Shithead into the wee hours. But soon we graduate and will leave town, so we probably won't get together for a game for a while. I want to develop a Java app we can use to play Shithead and our other favorites over a network. An app like this already exists, but it is ugly, buggy and does not support our house rules. The source is available, but is such a mess that I would really rather start from scratch than try to refactor it!

    Building my game using some standard playing card API or framework, if such a thing exists, would be better than starting from scratch. The answer to a similar SO question was to use the JPC-API, which allegedly provides basic playing card services and rendering. But this SourceForge project currently makes available no files or source code! Is there an alternative, or somewhere else to find this framework?

    Soon I will need help with the following as well:
      - Lobby services (finding other players)
      - Gaming network protocol (to communicate moves between players)
      - Game theory (to write the computer opponent)
      - Winning condition detection
      - Game UI development

  • What AOP tools exist for doing aspect-oriented programming at the assembly language level against x86?

    - by JohnnySoftware
    I'm looking for a tool I can use to do aspect-oriented programming at the assembly language level. For experimentation purposes, I would like the code weaver to operate on native application-level executables and dynamic link libraries. I have already done object-oriented AOP. I know assembly language for x86 and so forth.

    I would like to be able to do logging and other sorts of things using the familiar before/after/around constructs. I would like to be able to specify certain instructions, or sequences/patterns of consecutive instructions, as what to define a pointcut on, since assembly/machine language is not exactly the most semantically rich computer language on the planet. If debugger and linker symbols are available, naturally, I would like to be able to use them to identify subroutines' entry points, branch/call/jump target addresses, symbolic data addresses, etc.

    I would like the ability to send notifications out to other diagnostic tools. Thus, support for sending data through connection-oriented sockets and datagrams is highly desirable, as is normal logging to files, UI, etc. This can be done using the action part of an aspect to make a function call, but then there are portability issues, so the tool needs to support a flexible, well-abstracted logging/notifying mechanism with a clean, simple yet flexible interface. The goal is rapid QA.

    The idea is to be able to share aspect source code broadly within communities as well as publicly. So there needs to be a declarative security policy file that users can share. This ensures that nothing untoward hidden directly or indirectly in an aspect source file slips by the execution manager. The policy file format needs to be simple to read, write, modify, understand, type in, edit, and generate. Sort of like Java .policy files. Think the exact opposite of anything resembling XML Schema files and you get the idea.

    Is there such a tool in existence already?

  • How to dispatch a new property value in an object to the same property of two other objects

    - by WPFadvocate
    In WPF, I have three objects exposing the same DependencyProperty (let's say it's an integer). I want all three property values to remain synchronized, i.e. whenever the int value changes in one object, this value is propagated to the two other objects. I thought of multibinding to do the job, but I don't know how to detect which object changed, and thus which value should be used and propagated to the other objects.

    Edited: here is my tentative code for multibinding, with the false hope that it would work without additional code:

        // create the multibinding
        MultiBinding mb = new MultiBinding()
        {
            Mode = BindingMode.TwoWay,
            UpdateSourceTrigger = UpdateSourceTrigger.PropertyChanged
        };

        // create individual bindings to associate object_2 and object_3 with object_1
        Binding b2 = new Binding() { Source = object_2, Path = new PropertyPath("X") };
        Binding b3 = new Binding() { Source = object_3, Path = new PropertyPath("X") };

        // add individual bindings to the multibinding
        mb.Bindings.Add(b2);
        mb.Bindings.Add(b3);

        // bind object_2 and _3 to object_1
        BindingOperations.SetBinding(object_1, TypeObject_1.XProperty, mb);

    But actually there is a runtime error, saying the binding set by the last instruction is lacking a converter. And again, I don't know how to write this converter: there is nothing to convert (as there is in the related MS sample linking 3 RGB properties to a Color property), only the value of the property that changed to forward to the two other properties.

    I understand I could solve the problem by creating an X_Changed event in the 3 types and then having each object register to the other two objects' events. I don't like this "manual" way and would prefer to bind the 3 properties together.
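
    A MultiBinding always requires an IMultiValueConverter, even when no real conversion is needed. A minimal sketch of a pass-through converter for this scenario; the class name is illustrative, and note that it forwards the first usable source value, so it does not by itself answer the "which source changed" question (that generally needs tracking previous values or a different design):

        // Sketch only: pass-through converter for a TwoWay MultiBinding over identical int properties.
        using System;
        using System.Globalization;
        using System.Windows;
        using System.Windows.Data;

        public class PassThroughMultiConverter : IMultiValueConverter
        {
            public object Convert(object[] values, Type targetType, object parameter, CultureInfo culture)
            {
                // Forward the first bound value that is actually set; UnsetValue shows up
                // while the sources are still initializing.
                foreach (object v in values)
                {
                    if (v != DependencyProperty.UnsetValue)
                        return v;
                }
                return Binding.DoNothing;
            }

            public object[] ConvertBack(object value, Type[] targetTypes, object parameter, CultureInfo culture)
            {
                // Fan the target's new value back out to every source of the MultiBinding.
                object[] result = new object[targetTypes.Length];
                for (int i = 0; i < result.Length; i++)
                    result[i] = value;
                return result;
            }
        }

    The converter would be attached with mb.Converter = new PassThroughMultiConverter(); before the SetBinding call.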

  • Migrating Ruby Site from EngineYard to Heroku

    - by user410925
    As part of a larger project I've been tasked with migrating some existing Ruby on Rails sites (built with an old version of refinerycms, 0.9.6.34, at least that's the version listed in the Gemfile included with the source). I don't normally work with Ruby, so I'm at a bit of a loss. The previous developers simply handed over the latest git dump as well as a db dump. I'm working first on getting the site up and working locally on an Ubuntu 11.10 machine before pushing up to a test Heroku install. If it's possible to just push directly to Heroku with the files they gave, then I can try that, but it's my understanding I need to get everything working and then use Heroku's tools to deploy. The previous devs said they're using Ruby 1.8.7, so in Ubuntu I've done the following:

        aptitude install ruby1.8 ruby1.8-dev ruby1.8-full
        aptitude install rubygems1.8

    I've restored the database and, in the config directory, made changes to database.yml to point to the restored database. When I try to run "bundle install" from the root of the extracted source dir I get:

        Invalid gemspec in [/var/lib/gems/1.8/specifications/mail-2.4.4.gemspec]: invalid date format in specification: "2012-03-14 00:00:00.000000000Z"
        Invalid gemspec in [/var/lib/gems/1.8/specifications/tilt-1.3.3.gemspec]: invalid date format in specification: "2011-08-25 00:00:00.000000000Z"
        Invalid gemspec in [/var/lib/gems/1.8/specifications/mime-types-1.18.gemspec]: invalid date format in specification: "2012-03-21 00:00:00.000000000Z"
        Invalid gemspec in [/var/lib/gems/1.8/specifications/sass-rails-3.2.5.gemspec]: invalid date format in specification: "2012-03-19 00:00:00.000000000Z"
        Invalid gemspec in [/var/lib/gems/1.8/specifications/jquery-rails-2.0.2.gemspec]: invalid date format in specification: "2012-04-03 00:00:00.000000000Z"
        Invalid gemspec in [/var/lib/gems/1.8/specifications/mail-2.4.4.gemspec]: invalid date format in specification: "2012-03-14 00:00:00.000000000Z"
        Invalid gemspec in [/var/lib/gems/1.8/specifications/tilt-1.3.3.gemspec]: invalid date format in specification: "2011-08-25 00:00:00.000000000Z"
        Invalid gemspec in [/var/lib/gems/1.8/specifications/mime-types-1.18.gemspec]: invalid date format in specification: "2012-03-21 00:00:00.000000000Z"
        Invalid gemspec in [/var/lib/gems/1.8/specifications/sass-rails-3.2.5.gemspec]: invalid date format in specification: "2012-03-19 00:00:00.000000000Z"
        Invalid gemspec in [/var/lib/gems/1.8/specifications/jquery-rails-2.0.2.gemspec]: invalid date format in specification: "2012-04-03 00:00:00.000000000Z"
        Invalid gemspec in [/var/lib/gems/1.8/specifications/mail-2.4.4.gemspec]: invalid date format in specification: "2012-03-14 00:00:00.000000000Z"
        Invalid gemspec in [/var/lib/gems/1.8/specifications/tilt-1.3.3.gemspec]: invalid date format in specification: "2011-08-25 00:00:00.000000000Z"
        Invalid gemspec in [/var/lib/gems/1.8/specifications/mime-types-1.18.gemspec]: invalid date format in specification: "2012-03-21 00:00:00.000000000Z"
        Invalid gemspec in [/var/lib/gems/1.8/specifications/sass-rails-3.2.5.gemspec]: invalid date format in specification: "2012-03-19 00:00:00.000000000Z"
        Invalid gemspec in [/var/lib/gems/1.8/specifications/jquery-rails-2.0.2.gemspec]: invalid date format in specification: "2012-04-03 00:00:00.000000000Z"
        Fetching gem metadata from https://rubygems.org/.......
        Fetching gem metadata from https://rubygems.org/..
        Using rake (0.9.2.2)
        Using i18n (0.6.0)
        Using multi_json (1.3.6)
        Using activesupport (3.2.3)
        Using builder (3.0.0)
        Using activemodel (3.2.3)
        Using erubis (2.7.0)
        Using journey (1.0.3)
        Using rack (1.4.1)
        Using rack-cache (1.2)
        Using rack-test (0.6.1)
        Using hike (1.2.1)
        Installing tilt (1.3.3)
        Using sprockets (2.1.3)
        Using actionpack (3.2.3)
        Installing mime-types (1.18)
        Using polyglot (0.3.3)
        Using treetop (1.4.10)
        Installing mail (2.4.4)
        Using actionmailer (3.2.3)
        Using arel (3.0.2)
        Using tzinfo (0.3.33)
        Using activerecord (3.2.3)
        Using activeresource (3.2.3)
        Using acts_as_indexed (0.7.8)
        Using awesome_nested_set (2.1.3)
        Using babosa (0.3.7)
        Using bcrypt-ruby (3.0.1)
        Using coffee-script-source (1.3.3)
        Using execjs (1.4.0)
        Using coffee-script (2.2.0)
        Using rack-ssl (1.3.2)
        Using json (1.7.3)
        Using rdoc (3.12)
        Using thor (0.14.6)
        Using railties (3.2.3)
        Using coffee-rails (3.2.2)
        Using orm_adapter (0.0.7)
        Using warden (1.1.1)
        Using devise (2.0.4)
        Using dragonfly (0.9.12)
        Using friendly_id (4.0.6)
        Using paper_trail (2.6.3)
        Using globalize3 (0.2.0)
        Installing jquery-rails (2.0.2)
        Using bundler (1.1.4)
        Using rails (3.2.3)
        Using sass (3.1.19)
        Installing sass-rails (3.2.5)
        Using truncate_html (0.5.5)
        Using uglifier (1.2.4)
        Using will_paginate (3.0.3)
        Using refinerycms-core (2.0.4)
        Using refinerycms-authentication (2.0.4)
        Using refinerycms-dashboard (2.0.4)
        Using refinerycms-images (2.0.4)
        Using seo_meta (1.3.0)
        Using refinerycms-pages (2.0.4)
        Using refinerycms-resources (2.0.4)
        Using refinerycms (2.0.4)
        Using routing-filter (0.3.1)
        Using refinerycms-i18n (2.0.0)
        Using sqlite3 (1.3.6)
        Your bundle is complete! Use `bundle show [gemname]` to see where a bundled gem is installed.

    Obviously the errors with Invalid gemspec need to be resolved, but the other thing that's troubling to me are the lines:

        Using refinerycms-core (2.0.4)
        Using refinerycms-authentication (2.0.4)
        Using refinerycms-dashboard (2.0.4)
        Using refinerycms-images (2.0.4)
        Using seo_meta (1.3.0)
        Using refinerycms-pages (2.0.4)
        Using refinerycms-resources (2.0.4)
        Using refinerycms (2.0.4)
        Using routing-filter (0.3.1)
        Using refinerycms-i18n (2.0.0)

    since the refinerycms version listed in the Gemfile was 0.9.6.34. When it comes to the Ruby world, I'm a bit lost, so any help would be greatly appreciated. Thanks,

  • Oracle T4CPreparedStatement memory leaks?

    - by Jay
    A little background on the application that I am going to talk about in the next few lines: XYZ is a data masking workbench, an Eclipse RCP application. You give it a source table column and a target table column; it applies a transformation (encryption/shuffling/etc.) and copies the row data from the source table to the target table. Now, when I mask n tables at a time, n threads are launched by this app.

    Here is the issue: I have run into a production issue on the first roll-out of the above app. Unfortunately, I don't have any logs to get to the root cause. However, I tried to run this app in a test region and do a stress test. When I collected .hprof files and ran them through an analyzer (YourKit), I noticed that objects of oracle.jdbc.driver.T4CPreparedStatement were retaining heap. The analysis also tells me that one of my classes is holding a reference to this prepared statement object, and thereby n threads have n such objects. T4CPreparedStatement seemed to have character arrays lastBoundChars and bindChars, each of size char[300000].

    So I researched a bit (google!), obtained ojdbc6.jar and tried decompiling T4CPreparedStatement. I see that T4CPreparedStatement extends OraclePreparedStatement, which dynamically manages the array sizes of lastBoundChars and bindChars. So my questions here are:

      - Have you ever run into an issue like this?
      - Do you know the significance of lastBoundChars / bindChars?
      - I am new to profiling, so do you think I am not doing it correctly? (I also ran the hprofs through MAT, and this was the main identified issue, so I don't really think I could be wrong.)

    I found something similar on the web here: http://forums.oracle.com/forums/thread.jspa?messageID=2860681

    Appreciate your suggestions / advice.

  • Should programmers do pro bono work? Where are the code public defenders?

    - by Tj Kellie
    How many projects are people doing based on the pro bono publico ideal, versus working for the highest wage or the potential for a cash-in-buy-out payday?

    For years lawyers have been called out for excessive gathering of wealth from high bill rates and huge settlement deals, hiring out their knowledge and skills to the highest bidders. People call for them to do more for free, using the law and their time to defend or further some cause that's in the public's best interest. Is professional software development that different? So many bright people and so much knowledge of complex systems. Do you think there is enough of a "pro bono" movement to solve the social and public problems in the industry right now? If so, what are the examples to point to? OLPC?

    NOTE: Saying that open source software is the same as pro bono misses the point completely. I was looking for specific projects with a social context, not just group-sourcing for free software. Just because you're not making anyone pay for your software does not mean it's doing anyone any good. I'm not calling for mandatory enforcement of pro bono work for programmers; I really just want some objective opinions and concrete examples of socially minded software/tech development projects like the One Laptop Per Child project. I'm sure open source would be a natural tie-in for some.

  • Getting Started with AJAX Toolkit Controls

    - by Aviran
    I am trying to make my first AJAX control and I get an error. I probably missed some steps, but I can't find them even though I have read many tutorials - probably because I am new to AJAX - so I need to be guided step by step. These are the steps I've already done:

      1. Downloaded the AJAX Toolkit.
      2. Added these controls to the Toolbox.
      3. Created a new ASP.NET website (I heard about the AJAX-Enabled option, but I don't have this option).
      4. Added an AJAX control.

    And that's it. I read that I need to register AjaxControlToolkit.dll in the application's bin folder, but I don't know how to do that, and I don't have a Bin folder in my website, only an App_Data folder. Then I need to add this to the web.config:

        <add tagPrefix="ajaxToolkit" namespace="AjaxControlToolkit" assembly="AjaxControlToolkit"/>

    Then I need to add this to my website:

        <asp:ScriptManager ID="scriptmanager1" EnablePartialRendering="true" runat="Server" />

    This is the error I receive:

        Compilation Error
        Description: An error occurred during the compilation of a resource required to service this request.
        Please review the following specific error details and modify your source code appropriately.
        Compiler Error Message: CS0012: The type 'System.Web.UI.ExtenderControl' is defined in an assembly
        that is not referenced. You must add a reference to assembly 'System.Web.Extensions, Version=3.5.0.0,
        Culture=neutral, PublicKeyToken=31bf3856ad364e35'.

        Source Error:
        Line 16: <br />
        Line 17: <asp:Label ID="Label1" runat="server" Text="Label" Width="229px"></asp:Label><br />
        Line 18: <asp:ConfirmButtonExtender ID="ConfirmButtonExtender1" runat="server" ConfirmText="are you sure"
        Line 19:     TargetControlID="Button1">
        Line 20: </asp:ConfirmButtonExtender>

    Does anyone know how I can solve this error?

  • Workflow engine (BPMN, Drools, etc.) or ESB?

    - by Tom
    We currently have an application that is based on an in-house developed workflow engine with a YAML based DSL. We are looking to move parts of it to Java. I have discovered a number of Java solutions like Intalio, jBPM, Drools Expert, Drools Flow etc. They appear to be aimed at businesses where a business analyst creates the workflows using a graphical editor and submits them to the workflow engine. They seem geared towards ease of use for non-technical people rather than for developers, with a focus on human interaction. The workflows tend to look like:

        Discover-a-file -------\
                                -> join -> process-file -> move-file -> register-file
        Discover-some-metadata -/

    If any step fails we need to retry it X times. We also need to be able to stop the system, restart it, and have it continue from where it was (durable). Some of our workflows can be defined by a set of goals we need to achieve, so Jess's backwards rule chaining sounds interesting, but it is not open source.

    It might be that what we are after is a finite state machine engine, or just an Enterprise Service Bus with everything done as JMS queues. Is there a good open source workflow engine that is both standards-based and geared towards developers? We don't particularly want to use a graphical workflow designer or write reams of XML, and it should ideally be in Java or language agnostic (making REST/SOAP calls to external services).

    Thanks,
    Tom
