Search Results

Search found 8800 results on 352 pages for 'import'.

Page 7/352

  • How to build a database from an XSD schema and import XML data

    - by FreshCode
    I have a complex XSD schema and hundreds of XML files conforming to it. How do I automate the creation of related SQL Server tables to store the XML data? I've considered creating C# classes from the XSD schema using the xsd.exe tool and letting something like SubSonic figure out how to make a shiny database out of them, but I'm not sure that's the best way to approach it. Has anyone managed to elegantly import XSD files into SQL Server?
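
    One low-tech starting point is to generate a first-cut DDL script straight from the XSD and hand-tune it afterwards. The snippet below is only a sketch: it assumes a fairly flat schema where each complexType maps to one table, and the file name and type mapping are made up for illustration; nested types, choices and keys still need a real relational design.

        # Minimal sketch: walk the XSD with lxml and print CREATE TABLE statements.
        from lxml import etree

        XS = "http://www.w3.org/2001/XMLSchema"
        TYPE_MAP = {"xs:string": "NVARCHAR(255)", "xs:int": "INT",
                    "xs:decimal": "DECIMAL(18, 4)", "xs:dateTime": "DATETIME"}

        tree = etree.parse("schema.xsd")  # hypothetical file name
        for ctype in tree.iter("{%s}complexType" % XS):
            table = ctype.get("name")
            if not table:
                continue
            cols = []
            for el in ctype.iter("{%s}element" % XS):
                sql_type = TYPE_MAP.get(el.get("type"), "NVARCHAR(MAX)")
                cols.append("    [%s] %s NULL" % (el.get("name"), sql_type))
            if cols:
                print("CREATE TABLE [%s] (\n%s\n);" % (table, ",\n".join(cols)))

    For loading the actual XML afterwards, SQL Server's own XML bulk-load / OPENXML facilities or a generated C# loader are the usual next step.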


  • setting up git on cygwin - openssl

    - by Pete Field
    I'm trying to get git running in Cygwin on a Windows 7 machine. I have git unpacked in the directory git-1.7.1.1, and when I run make install from within that directory, I get:

        CC fast-import.o
        In file included from builtin.h:4, from fast-import.c:147:
        git-compat-util.h:136:19: iconv.h: No such file or directory
        git-compat-util.h:140:25: openssl/ssl.h: No such file or directory
        git-compat-util.h:141:25: openssl/err.h: No such file or directory
        In file included from builtin.h:6, from fast-import.c:147:
        cache.h:9:21: openssl/sha.h: No such file or directory
        In file included from fast-import.c:156:
        csum-file.h:10: error: parse error before "SHA_CTX"
        csum-file.h:10: warning: no semicolon at end of struct or union
        csum-file.h:15: error: 'crc32' redeclared as different kind of symbol
        /usr/include/zlib.h:1285: error: previous declaration of 'crc32' was here
        csum-file.h:15: error: 'crc32' redeclared as different kind of symbol
        /usr/include/zlib.h:1285: error: previous declaration of 'crc32' was here
        csum-file.h:17: error: parse error before '}' token
        fast-import.c: In function `store_object':
        fast-import.c:995: error: `SHA_CTX' undeclared (first use in this function)
        fast-import.c:995: error: (Each undeclared identifier is reported only once
        fast-import.c:995: error: for each function it appears in.)
        fast-import.c:995: error: parse error before "c"
        fast-import.c:1000: warning: implicit declaration of function `SHA1_Init'
        fast-import.c:1000: error: `c' undeclared (first use in this function)
        fast-import.c:1001: warning: implicit declaration of function `SHA1_Update'
        fast-import.c:1003: warning: implicit declaration of function `SHA1_Final'
        fast-import.c: At top level:
        fast-import.c:1118: error: parse error before "SHA_CTX"
        fast-import.c: In function `truncate_pack':
        fast-import.c:1120: error: `to' undeclared (first use in this function)
        fast-import.c:1126: error: dereferencing pointer to incomplete type
        fast-import.c:1127: error: dereferencing pointer to incomplete type
        fast-import.c:1128: error: dereferencing pointer to incomplete type
        fast-import.c:1128: error: `ctx' undeclared (first use in this function)
        fast-import.c: In function `stream_blob':
        fast-import.c:1140: error: `SHA_CTX' undeclared (first use in this function)
        fast-import.c:1140: error: parse error before "c"
        fast-import.c:1154: error: `pack_file_ctx' undeclared (first use in this function)
        fast-import.c:1154: error: dereferencing pointer to incomplete type
        fast-import.c:1160: error: `c' undeclared (first use in this function)
        make: *** [fast-import.o] Error 1

    I'm guessing that most of these errors are due to the iconv.h and OpenSSL headers, which apparently are missing, but I can't figure out how I'm supposed to install those (if I am), or if there is some other way to get around this.


  • Import XML to database with high-end performance and audit log - a best practice

    - by karthik
    Hi, I have to import big XML files into an MS SQL 2005 database using C#, with high performance. Even if a record fails in the middle, I have to move on to the next record, and the failed record needs to be logged for auditing. I don't want to put an insert query inside a for loop. Could you please suggest the best way to do this? If I can use bulk-copy methods or DataAdapter update methods, that's very nice - but if any record fails, execution of that statement breaks and the whole thing is rolled back, right? Any alternatives and best practices, with examples please? Does multi-threading work for me to improve performance? Please give an example. Thanks, Karthikeyan
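
    The pattern being asked for (keep going on a bad record, log it, audit later) is independent of the language; a minimal sketch of it in Python is below. The connection string, table and element names are assumptions, and a C# version would have the same shape using SqlCommand or batched SqlBulkCopy per chunk rather than a commit per row.

        # Sketch: stream a large XML file and insert record-by-record, logging failures
        # to an audit log instead of aborting the whole import.
        import logging
        import pyodbc
        from lxml import etree

        logging.basicConfig(filename="import_audit.log", level=logging.INFO)
        conn = pyodbc.connect("DRIVER={SQL Server};SERVER=.;DATABASE=Demo;Trusted_Connection=yes")
        cur = conn.cursor()

        for _, rec in etree.iterparse("big_file.xml", tag="record"):  # assumed element name
            try:
                cur.execute("INSERT INTO Records (Id, Name) VALUES (?, ?)",
                            rec.get("id"), rec.findtext("name"))
                conn.commit()                 # commit per record, so one failure rolls back only itself
            except Exception as exc:
                conn.rollback()
                logging.error("record %s failed: %s", rec.get("id"), exc)
            finally:
                rec.clear()                   # release parsed elements to keep memory flat

        conn.close()

    Committing in batches of a few hundred rows, and retrying a failed batch row-by-row, is the usual compromise between throughput and per-record error isolation.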


  • Import text file crunching library for Java/Groovy?

    - by devdude
    In a lot of real-life applications we face the requirement to import some kind of (text) file. Usually we implement some (hardcoded?) logic to validate the file (e.g. proper header, proper number of delimiters, proper date/time values, etc.). Eventually we also need to check for the existence of related data in a table (e.g. the value of field 1 in the text file must have an entry in some basic data table). While XML solves this (to some extent) with XSD and DTD, we end up hacking this again and again for proprietary text file formats. Is there any library or framework that allows the creation of templates similar to the XSD approach? This would make it much more flexible to react to file format changes or to implement new formats. Thanks for any hints. Sven
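
    The "template" idea itself is small enough to sketch. The snippet below is a rough illustration in Python rather than Java/Groovy - one declarative spec per file format, applied line by line, plus a cross-check against reference data - and every column name and rule in it is invented for the example.

        # Sketch: declarative per-column rules instead of a hardcoded parser.
        import csv
        from datetime import datetime

        SPEC = {
            "order_id":   lambda v: v.isdigit(),
            "customer":   lambda v: len(v) > 0,
            "order_date": lambda v: bool(datetime.strptime(v, "%Y-%m-%d")),
            "amount":     lambda v: float(v) >= 0,
        }

        def validate(path, known_customers):
            errors = []
            with open(path, newline="", encoding="utf-8") as f:
                reader = csv.DictReader(f, delimiter=";")
                missing = set(SPEC) - set(reader.fieldnames or [])
                if missing:
                    return ["header is missing columns: %s" % sorted(missing)]
                for lineno, row in enumerate(reader, start=2):
                    for col, rule in SPEC.items():
                        try:
                            ok = rule(row[col])
                        except (ValueError, TypeError):
                            ok = False
                        if not ok:
                            errors.append("line %d: bad %s: %r" % (lineno, col, row[col]))
                    # the "must exist in a basic data table" check from the question
                    if row["customer"] not in known_customers:
                        errors.append("line %d: unknown customer %r" % (lineno, row["customer"]))
            return errors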


  • Excel import to SQL returning NULL for decimals when in VARCHAR data type

    - by Daniel
    Hi, I am working on a piece of software which has grown exponentially over the last few years, and the database needs to be regularly updated. Customers now provide us with data on large spreadsheets, which we format and will start importing into the database. I am using the Import and Export Data (32-bit) wizard. One column in the database contains values like '1.1.1.2', so I am importing them as VARCHAR, since that is the data type in the database. However, for values like '8.5', NULL is getting imported instead. It only occurs when there is one decimal point. Is this a formatting error with Excel, or is it the wrong data type?
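
    The classic cause of this (assuming the wizard is going through the Jet/ACE Excel driver) is type guessing: the driver samples the first few rows and picks a single type for the whole column, and cells stored as the other type - here '8.5', held as a number in a column guessed as text - come back as NULL. The usual workarounds are the IMEX=1 / TypeGuessRows registry tweak, or sidestepping the driver and forcing everything to text in code. The sketch below does the latter in Python; the file, table and connection details are made up.

        # Sketch: read every cell as text with pandas, then bulk-insert the strings.
        import pandas as pd
        from sqlalchemy import create_engine

        df = pd.read_excel("customer_data.xlsx", dtype=str)   # dtype=str: no type guessing
        df = df.fillna("")                                    # keep empty cells as '' not NULL

        engine = create_engine("mssql+pyodbc://./Demo?driver=ODBC+Driver+17+for+SQL+Server"
                               "&trusted_connection=yes")     # assumed connection details
        df.to_sql("ImportedCustomerData", engine, if_exists="append", index=False)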


  • best way to import 500MB csv file into mysql database?

    - by mars
    I have a 500MB CSV file that needs to be imported into my MySQL database. I've made a PHP page where I can upload the CSV file; it analyses the fields and so on and does the actual importing, but it can only handle small files, 5MB max. So that's 100 files, and the uploading is actually pretty slow. Is there another way? I have to repeat this process every month because the data in the file changes every month - it's about 12,000,000 lines :D
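
    For a file this size the usual answer is to skip the upload-and-parse loop entirely and hand the file to MySQL's LOAD DATA INFILE, which reads the CSV in one pass. A rough sketch follows (in Python only because a driver is needed to send the statement; host, credentials, table and CSV layout are all assumptions):

        # Sketch: let MySQL parse the CSV itself via LOAD DATA LOCAL INFILE.
        # Note: the server must have local_infile enabled for the LOCAL form.
        import mysql.connector

        conn = mysql.connector.connect(user="importer", password="secret",
                                       database="shop", allow_local_infile=True)
        cur = conn.cursor()
        cur.execute("TRUNCATE TABLE products_monthly")        # monthly refresh: start clean
        cur.execute("""
            LOAD DATA LOCAL INFILE '/data/monthly.csv'
            INTO TABLE products_monthly
            FIELDS TERMINATED BY ',' ENCLOSED BY '"'
            LINES TERMINATED BY '\\n'
            IGNORE 1 LINES
        """)
        conn.commit()
        conn.close()

    The same statement can be run from the mysql command-line client or from PHP; the point is that 12 million rows go in as one bulk operation instead of millions of individual INSERTs.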


  • Export and import UTF-8 data in MySQL: best practices

    - by ChrisRamakers
    We're often faced with the need to send a data file to one of our clients with data from the database that he/she needs to translate. Most of the time this export is CSV or XLS. Most of the time we create a CSV dump with phpMyAdmin and get an XLS file in return with the translated data. The problem is that most of the time the data is UTF-8, and when the file is returned as XLS, each and every time we load the data into MySQL again we end up with UTF-8 problems: characters not being displayed properly, etc. We've already double-checked everything in MySQL, from my.cnf to column character sets, and everything is set correctly to UTF-8. My question is not how to fix the encoding issue, since that's been solved, but how we would best proceed in the future when handling this situation. What export format should we hand over? How should we import (just MySQL LOAD DATA INFILE, or our own processing scripts)? What is the general consensus on how to handle this situation? We would like to continue using Excel if possible, since that's the format almost everybody expects, including our clients' translation agencies. Our clients' ease of use is the most important factor here, without overloading us with major issues each time. The best of both worlds :)
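
    One workable routine, sketched below, is to keep Excel as the exchange format for the translators but never let Excel touch the encoding on the way back in: read the returned workbook in code, write a UTF-8 CSV yourself, and load that with an explicit character set. The file, table and column layout here are assumptions.

        # Sketch: .xlsx in, UTF-8 CSV out, then LOAD DATA with an explicit charset.
        import csv
        from openpyxl import load_workbook

        wb = load_workbook("translations_back.xlsx", read_only=True)
        with open("translations_back.csv", "w", newline="", encoding="utf-8") as out:
            writer = csv.writer(out)
            for row in wb.active.iter_rows(values_only=True):
                writer.writerow(["" if cell is None else cell for cell in row])

        # Then on the MySQL side (note the explicit CHARACTER SET):
        #   LOAD DATA LOCAL INFILE 'translations_back.csv' INTO TABLE translations
        #   CHARACTER SET utf8mb4
        #   FIELDS TERMINATED BY ',' ENCLOSED BY '"' LINES TERMINATED BY '\n';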


  • Import MySQL file in PHP

    - by Cudos
    I have a MySQL file which I want to import via PHP 5. In the name of user-friendliness, the user should not have to use tools like phpMyAdmin etc. - just hit a button and the file will get imported. I have already created code to upload the file to a location on the server. The file looks like this:

        INSERT INTO products VALUES ('', '0', '10', '', '1', 'be34112', '4536.jpg', '','','','0');
        SET @master_id = LAST_INSERT_ID();
        INSERT INTO products_description VALUES ('', '1', @master_id, '1', 'Kjole', '', 'beskrivelse', '2000', '25', 'kjole.xml', '', '', '');
        INSERT INTO products_to_categories VALUES ('',@master_id,'5');

        INSERT INTO products VALUES ('', @master_id, '10', '12', '1', 'be34112', '4536.jpg', '200','','','0');
        SET @variant_id = LAST_INSERT_ID();
        INSERT INTO products_description VALUES ('', '1', @variant_id, '1', 'Kjole', '', 'beskrivelse', '2000', '25', 'kjole.xml', '', '', '');
        INSERT INTO options_to_products VALUES ('', @variant_id, '1', '1');
        INSERT INTO options_to_products VALUES ('', @variant_id, '', '2');

        INSERT INTO products VALUES ('', @master_id, '20', '17', '1', 'be34113', '4537.jpg', '200','','','0');
        SET @variant_id = LAST_INSERT_ID();
        INSERT INTO products_description VALUES ('', '1', @variant_id, '1', 'Kjole', '', 'beskrivelse æøå ÆØÅ & íjj´¨¨¨¨fdfd""', '3000', '25', 'kjole.xml', '', '', '');
        INSERT INTO options_to_products VALUES ('', @variant_id, '1', '');
        INSERT INTO options_to_products VALUES ('', @variant_id, '', '4');


  • MYOB Import "amount paid"

    - by php-b-grader
    I seem to have found an anomaly with MYOB (I've actually found many anomalies; this is just another one that is doing my head in). I am generating a file with all invoices from the web system - no problems. If an invoice has multiple lines and the account is paid COD, I am having a problem, e.g.:

        "INV", "DATE" ... "AMOUNT", "INC TAX AMOUNT" ... "AMOUNT PAID"
        8421, 12/06/2010 ... 60, 66 ... 66
        8421, 12/06/2010 ... 120, 132 ... 132
        8421, 12/06/2010 ... 96, 105.6 ... 105.6
        8421, 12/06/2010 ... 84, 92.4 ... 92.4

    When I import this file, the balance of the invoice is still outstanding. What appears to be the issue is that it is only importing the first line's "amount paid", so in other words, based on the above:

        Invoice 8421 is imported with 4 lines
        The total invoice amount is $396
        The amount paid (that is imported) is $66
        The outstanding balance = $330

    Surely the first line isn't expected to be: inc. tax amount = $66, amount paid = $396? It seems completely illogical to me... am I doing something wrong, or is MYOB just really bad?


  • Android NDK import-module / code reuse

    - by Graeme
    Morning! I've created a small NDK project which allows dynamic serialisation of objects between Java and C++ through JNI. The logic works like this:

        Bean -> JavaCInterface.java -> JavaCInterface.cpp -> JavaCInterface.java -> Bean

    The problem is I want to use this functionality in other projects, so I separated out the test code and created a "Tester" project. The tester project sends a Java object through to C++, which then echoes it back to the Java layer. I thought linking would be pretty simple ("simple" in terms of NDK/JNI usually means a day of frustration). I added the JNIBridge project as a source project and added the following lines to the Android.mk files:

        NDK_MODULE_PATH=.../JNIBridge/jni/

        JNIBridge/jni/JavaCInterface/Android.mk:
            ...
            include $(BUILD_STATIC_LIBRARY)

        JNITester/jni/Android.mk:
            ...
            include $(BUILD_SHARED_LIBRARY)
            $(call import-module, JavaCInterface)

    This all works fine. The C++ files which rely on headers from the JavaCInterface module compile fine, the Java classes can happily use interfaces from the JNIBridge project, and all the linking is happy. Unfortunately, JavaCInterface.java, which contains the native method declarations, cannot see the JNI methods located in the static library. (Logically they are in the same project, but both are imported into the project where you wish to use them through the above mechanism.)

    My current solutions are as follows, and I'm hoping someone can suggest something that will preserve the modular nature of what I'm trying to achieve. The first is to include the JavaCInterface cpp files in the calling project like so:

        LOCAL_SRC_FILES := FunctionTable.cpp $(PATH_TO_SHARED_PROJECT)/JavaCInterface.cpp

    But I'd rather not do this, as it would mean updating each depending project whenever I changed the JavaCInterface architecture. Alternatively, I could create a new set of JNI method signatures in each local project which then link to the imported modules - but again, this binds the implementations too tightly.


  • Best way to migrate export/import from SQL Server to Oracle

    - by matao
    Hi guys! I'm faced with needing access, for reporting, to some data that lives in Oracle and other data that lives in a SQL Server 2000 database. For various reasons these live on different sides of a firewall. Now we're looking at doing an export/import from SQL Server to Oracle, and I'd like some advice on the best way to go about it. The procedure will need to be fully automated and run nightly, so that excludes using the SQL Developer tools. I also can't make a live link between the databases from our (Oracle) side, as the firewall is in the way. The data needs to be transformed in the process from a star schema to a denormalised table ready for reporting.

    What I'm thinking about is writing a monster query for SQL Server (which I mostly have already) that will denormalise and read out the data into a flat file using the SQL Server equivalent of sqlplus as a scheduled task, dump it into a well-known location, and then on the Oracle side have a cron job that copies down the file, loads it with SQL*Loader, rebuilds indexes, etc. This is all doable, but very manual. Is there one tool, or a combination of FOSS or standard Oracle/SQL Server tools, that could automate this for me? The irreducible complexity is the query on one side and building indexes on the other, but I would love not to have to write the CSV-dumping detail or the SQL*Loader script - just say "dump this view out to CSV" on one side, and "truncate and insert into this table from CSV" on the other, and not worry about mapping column names and all the other arcane sqlldr voodoo. Best practices? Thoughts? Comments?

    Edit: I have 50+ columns, all of varying types and lengths, in my dataset, which is why I'd prefer not to have to write out how to generate and map each single column.
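
    Assuming the bcp/sqlldr route is kept, the orchestration itself can be a very small script on each side rather than anything manual. The sketch below shows the shape of it in Python, with all server names, paths, credentials and the control file invented, and the bcp/sqlldr options being the common ones rather than anything verified against a specific version.

        # Sketch: nightly glue around bcp (SQL Server side) and sqlldr (Oracle side).
        import subprocess

        CSV = r"\\shared\exports\report_extract.csv"   # the "well known location"

        # SQL Server side: dump the already-denormalised view as character data.
        subprocess.run(
            ["bcp", "reporting.dbo.vw_report_extract", "out", CSV,
             "-c", "-t,", "-S", "SQLSERVER01", "-T"],
            check=True)

        # Oracle side (a separate cron job, after the file has crossed the firewall):
        subprocess.run(
            ["sqlldr", "report_user/secret@ORCL",
             "control=report_extract.ctl",
             "data=report_extract.csv",
             "log=report_extract.log"],
            check=True)

    The control file is still hand-written once, but after that only the data changes night to night; index rebuilds can be tacked on as a post-load SQL*Plus step.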


  • importing same module more than once

    - by wallacoloo
    So after a few hours, I discovered the cause of a bug in my application. My app's source is structured like this:

        main/
            __init__.py
            folderA/
                __init__.py
                fileA.py
                fileB.py

    Really, there are about 50 more files, but that's not the point. In main/__init__.py, I have this code:

        from folderA.fileA import *

    In folderA/__init__.py I have this code:

        sys.path.append(pathToFolderA)

    In folderA/fileB.py I have this code:

        from fileA import *

    The problem is that fileA gets imported twice. However, I only want it imported once. The obvious way to fix this (to me at least) is to change certain paths from path to folderA.path. But I feel like Python should not even have this error in the first place. What other workarounds are there that don't require each file to know its absolute location?
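
    The usual fix, sketched below, is to drop the sys.path manipulation and refer to fileA through one canonical package path everywhere, so Python caches a single entry in sys.modules instead of two ('fileA' and 'folderA.fileA'). Whether the absolute form needs the 'main.' prefix depends on which directory is actually on sys.path when the app starts.

        # main/__init__.py
        from main.folderA.fileA import *        # or, relative: from .folderA.fileA import *

        # main/folderA/__init__.py
        # (no sys.path.append needed)

        # main/folderA/fileB.py
        from main.folderA.fileA import *        # or, relative: from .fileA import *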


  • "Cannot import name genshi" error when installing the Swab library

    - by ATMathew
    I'm trying to install the Swab library for Python 2.6 in Ubuntu 10.10. However, I get the following error messages when I try to import it. In the terminal I ran:

        sudo easy_install swab
        sudo easy_install Genshi

    In the Python interpreter I ran:

        >>> import swab
        Traceback (most recent call last):
          File "<stdin>", line 1, in <module>
          File "/usr/local/lib/python2.6/dist-packages/swab-0.1.2-py2.6.egg/swab/__init__.py", line 23, in <module>
            from pestotools.genshi import genshi, render_docstring
        ImportError: cannot import name genshi

    I don't know what's going on. Can anyone help?
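
    Worth noting when reading that traceback: the failure is raised inside Swab's own __init__.py on its "from pestotools.genshi import genshi" line, so the trouble is with the pestotools/Genshi pairing that Swab depends on rather than with Swab's install itself. "cannot import name genshi" usually means pestotools.genshi was found but could not provide that name - often a version mismatch between pestotools and the installed Genshi. A quick diagnostic sketch from the interpreter (not a fix):

        >>> import genshi                      # the Genshi you easy_installed
        >>> import pestotools.genshi           # Swab's helper package
        >>> from pestotools.genshi import genshi, render_docstring   # the line that fails

    Whichever of those three lines fails first tells you whether to (re)install Genshi, pestotools, or a matching pair of versions.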


  • Import FBX with multiple meshes into UDK

    - by Tom
    I used this script to generate a few buildings that I was hoping to import into UDK. Each building is made of about 1,000 separate objects. When I export a building as FBX and import the file into UDK, it breaks it back up into its individual objects, so I was wondering how to avoid this - whether there is a tool to combine all of the objects into one mesh automatically before exporting, or whether I can prevent UDK from breaking them apart on import.


  • How can I import a CSV file into an SQLite table that doesn't allow null values?

    - by Philip
    I've been using the SQLite Manager extension for Firefox to edit my Chrome Web Data file in order to restore my keyword searches, and I think I have everything in place, except that when I import a CSV file into a table:

        (a) it won't import into the actual table, because the table doesn't allow null values
        (b) if I import it into a new table that does allow null values, then all the empty strings end up as null, and I have to manually edit each row - type and delete a character - before it's set to the empty string and is fine

    So: is there either a way to import the CSV so that empty cells are automatically turned into empty strings instead of null, OR is there a way, once a table with null values has been imported, to convert it into one that doesn't allow null values, where each formerly null value is the empty string? Thanks!
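
    If the staging-table route (option b) is acceptable, the manual cell-by-cell editing can be replaced with a couple of SQL statements run against the same file. The sketch below uses Python's built-in sqlite3 module; the table and column names are placeholders rather than Chrome's real schema.

        # Sketch: rewrite NULLs as '' in the permissive staging table, then copy across.
        import sqlite3

        conn = sqlite3.connect("Web Data")              # Chrome's web data file
        for col in ("short_name", "keyword", "url"):    # placeholder column names
            conn.execute("UPDATE keywords_staging SET %s = '' WHERE %s IS NULL" % (col, col))
        conn.execute("INSERT INTO keywords SELECT * FROM keywords_staging")
        conn.commit()
        conn.close()

    The same effect is available without leaving SQL, e.g. INSERT INTO keywords SELECT ..., COALESCE(url, ''), ... FROM keywords_staging.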


  • import txt files using excel interop in C# (QueryTables.Add)

    - by kite
    Hi all, I am trying to insert a text file into an Excel cell using QueryTables.Add; there is no error, but the worksheet is empty, except for the single cell manipulated via the Value2 property. I have already used a recorded macro to find the objects to use. Can you help me with this? (I am using VS2008, C#, and Excel 2003 and 2007; both show an empty cell.) Below is my code; thanks for your help.

        Application application = new ApplicationClass();
        try
        {
            object misValue = Missing.Value;
            wbDoc = application.Workbooks.Open(flnmDoc, misValue, misValue, misValue, misValue, misValue, misValue,
                                               misValue, misValue, misValue, misValue, misValue, misValue, misValue, misValue);
            wsRefDocBudgetOwner = (Worksheet)wbDoc.Worksheets[2];
            Range lRange = wsRefDocBudgetOwner.get_Range("B2", "B25");
            var temp2 = wsRefDocBudgetOwner.QueryTables;
            var temp = temp2.Add(@"TEXT;d:\temp\config ssas.txt", lRange, Type.Missing);
            //temp.RefreshStyle = XlCellInsertionMode.xlInsertDeleteCells;
            //temp.RefreshOnFileOpen = true;
            wsRefDocBudgetOwner.get_Range("B1", "B1").Value2 = "Lgfdgast adsffdafadfads";
            wbDoc.Save();
            //wbDoc.SaveAs(flnmDoc2, misValue, misValue, misValue, misValue, misValue, XlSaveAsAccessMode.xlExclusive,
            //    misValue, misValue, misValue, misValue, misValue);
            wbDoc.Close(Missing.Value, Missing.Value, Missing.Value);
        }
        finally
        {
            application.Quit();
        }


  • Import / include assigned variables in Jinja2

    - by Brian M. Hunt
    In Jinja2, how can one access assigned variables (i.e. {% set X=Y %}) within files incorporated with include? I'd expect the following to work given two Jinja2 files:

        A.jinja:
            Stuff
            {% include 'B.jinja' -%}
            B has {{ N }} references

        B.jinja:
            {% set N = 12 %}

    I'd expect that A.jinja, when compiled with Jinja2, would produce the following output:

        Stuff
        B has 12 references

    However, it produces:

        Stuff
        B has references

    I'd be much obliged for any input as to how to access the Jinja2 variables, such as N above, in the file that includes the file where N is set. Thank you for reading. Brian
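
    The behaviour is by design: include renders B.jinja but does not export its assignments back into A's namespace, whereas import does. A minimal runnable check of the import route, sketched with an in-memory DictLoader and the template names from above:

        # Sketch: pull N out of B.jinja with {% from ... import ... %} instead of include.
        from jinja2 import Environment, DictLoader

        env = Environment(loader=DictLoader({
            "B.jinja": "{% set N = 12 %}",
            "A.jinja": "{% from 'B.jinja' import N %}"
                       "Stuff\n"
                       "B has {{ N }} references",
        }))
        print(env.get_template("A.jinja").render())
        # Stuff
        # B has 12 references

    With {% import 'B.jinja' as b %} the value is reachable as {{ b.N }} instead.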


  • How to import data in SQL Compact Edition?

    - by Peter
    I don't seem to find a tool for it, nor an ODBC driver. Thanks.

    UPDATE: I'm aware of the SQL scripting possibilities. But then again: how do you script a SQL 2000 table (not just the DDL, but the data too)? Of course you can write this all yourself, but importing data into CE can't be that much of a hassle, can it?

    UPDATE 2: I don't seem to be able to choose the right dialect for inserting.
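
    When no ready-made tool is at hand, the fallback that usually works is generating a plain INSERT script from the source table and replaying it against the .sdf file, since SQL CE runs simple INSERTs happily. A rough sketch of the generator half follows; the connection string, table name and quoting rules are assumptions, and dates or decimals may need format tweaks.

        # Sketch: dump a SQL Server 2000 table as INSERT statements for SQL CE.
        import pyodbc

        conn = pyodbc.connect("DRIVER={SQL Server};SERVER=OLDBOX;DATABASE=Legacy;Trusted_Connection=yes")
        cur = conn.cursor()
        cur.execute("SELECT * FROM Customers")
        columns = [d[0] for d in cur.description]

        def literal(value):
            if value is None:
                return "NULL"
            if isinstance(value, (int, float)):
                return str(value)
            return "'" + str(value).replace("'", "''") + "'"   # naive quoting, fine for a one-off script

        with open("customers_ce.sql", "w", encoding="utf-8") as out:
            for row in cur.fetchall():
                out.write("INSERT INTO Customers (%s) VALUES (%s);\n"
                          % (", ".join(columns), ", ".join(literal(v) for v in row)))

    The generated file can then be run against the CE database from Visual Studio's query window or any SQL CE-aware client.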


  • Import Xcode project inside another Xcode project

    - by bruno
    I imported an Xcode project inside another Xcode project. I dragged and dropped project B inside project A, as in "How to Call Xcode Project In Another Xcode Project.......?". Next, I imported a class from project B in project A so that I could use one of its methods, but it gave me the error "'ClassTemp.h' file not found". From what I've read, this should have worked. Do I have to do some kind of configuration for it to work?
