Search Results

Search found 11052 results on 443 pages for 'linked tables'.


  • 12c - flashforward, flashback or see it as of now...

    - by noreply(at)blogger.com (Thomas Kyte)
    Oracle 9i exposed flashback query to developers for the first time. The ability to flashback query dates back to version 4 however (it just wasn't exposed). Every time you run a query in Oracle it is in fact a flashback query - it is what multi-versioning is all about.

    However, there was never a flashforward query (well, ok, the workspace manager has this capability - but with lots of extra baggage). We've never been able to ask a table "what will you look like tomorrow" - but now we can.

    The capability is called Temporal Validity. If you have a table with data that is effective dated - has a "start date" and "end date" column in it - we can now query it using flashback query-like syntax. The twist is - the date we "flashback" to can be in the future. It works by rewriting the query to transparently add the necessary where clause and filter out the right rows for the right period of time - and since you can have records whose start date is in the future - you can query a table and see what it would look like at some future time. Here is a quick example, we'll start with a table:

      ops$tkyte%ORA12CR1> create table addresses
        2  ( empno       number,
        3    addr_data   varchar2(30),
        4    start_date  date,
        5    end_date    date,
        6    period for valid(start_date,end_date)
        7  )
        8  /

      Table created.

    The new bit is on line 6 (it can be altered into an existing table - so any table you have with a start/end date column will be a candidate). The keyword is PERIOD; valid is an identifier I chose - it could have been foobar, valid just sounds nice in the query later. You identify the columns in your table - or we can create them for you if they don't exist. Then you just create some data:

      ops$tkyte%ORA12CR1> insert into addresses (empno, addr_data, start_date, end_date )
        2  values ( 1234, '123 Main Street', trunc(sysdate-5), trunc(sysdate-2) );

      1 row created.

      ops$tkyte%ORA12CR1> insert into addresses (empno, addr_data, start_date, end_date )
        2  values ( 1234, '456 Fleet Street', trunc(sysdate-1), trunc(sysdate+1) );

      1 row created.

      ops$tkyte%ORA12CR1> insert into addresses (empno, addr_data, start_date, end_date )
        2  values ( 1234, '789 1st Ave', trunc(sysdate+2), null );

      1 row created.

    And you can either see all of the data:

      ops$tkyte%ORA12CR1> select * from addresses;

           EMPNO ADDR_DATA                      START_DAT END_DATE
      ---------- ------------------------------ --------- ---------
            1234 123 Main Street                27-JUN-13 30-JUN-13
            1234 456 Fleet Street               01-JUL-13 03-JUL-13
            1234 789 1st Ave                    04-JUL-13

    Or query "as of" some point in time - as you can see in the predicate section, it is just doing a query rewrite to automate the "where" filters:

      ops$tkyte%ORA12CR1> select * from addresses as of period for valid sysdate-3;

           EMPNO ADDR_DATA                      START_DAT END_DATE
      ---------- ------------------------------ --------- ---------
            1234 123 Main Street                27-JUN-13 30-JUN-13

      ops$tkyte%ORA12CR1> @plan
      ops$tkyte%ORA12CR1> select * from table(dbms_xplan.display_cursor);

      PLAN_TABLE_OUTPUT
      -------------------------------------------------------------------------------
      SQL_ID  cthtvvm0dxvva, child number 0
      -------------------------------------
      select * from addresses as of period for valid sysdate-3

      Plan hash value: 3184888728

      -------------------------------------------------------------------------------
      | Id  | Operation         | Name      | Rows  | Bytes | Cost (%CPU)| Time     |
      -------------------------------------------------------------------------------
      |   0 | SELECT STATEMENT  |           |       |       |     3 (100)|          |
      |*  1 |  TABLE ACCESS FULL| ADDRESSES |     1 |    48 |     3   (0)| 00:00:01 |
      -------------------------------------------------------------------------------

      Predicate Information (identified by operation id):
      ---------------------------------------------------

         1 - filter((("T"."START_DATE" IS NULL OR
                    "T"."START_DATE"<=SYSDATE@!-3) AND ("T"."END_DATE" IS NULL OR
                    "T"."END_DATE">SYSDATE@!-3)))

      Note
      -----
         - dynamic statistics used: dynamic sampling (level=2)

      24 rows selected.

      ops$tkyte%ORA12CR1> select * from addresses as of period for valid sysdate;

           EMPNO ADDR_DATA                      START_DAT END_DATE
      ---------- ------------------------------ --------- ---------
            1234 456 Fleet Street               01-JUL-13 03-JUL-13

      ops$tkyte%ORA12CR1> @plan
      ops$tkyte%ORA12CR1> select * from table(dbms_xplan.display_cursor);

      PLAN_TABLE_OUTPUT
      -------------------------------------------------------------------------------
      SQL_ID  26ubyhw9hgk7z, child number 0
      -------------------------------------
      select * from addresses as of period for valid sysdate

      Plan hash value: 3184888728

      -------------------------------------------------------------------------------
      | Id  | Operation         | Name      | Rows  | Bytes | Cost (%CPU)| Time     |
      -------------------------------------------------------------------------------
      |   0 | SELECT STATEMENT  |           |       |       |     3 (100)|          |
      |*  1 |  TABLE ACCESS FULL| ADDRESSES |     1 |    48 |     3   (0)| 00:00:01 |
      -------------------------------------------------------------------------------

      Predicate Information (identified by operation id):
      ---------------------------------------------------

         1 - filter((("T"."START_DATE" IS NULL OR
                    "T"."START_DATE"<=SYSDATE@!) AND ("T"."END_DATE" IS NULL OR
                    "T"."END_DATE">SYSDATE@!)))

      Note
      -----
         - dynamic statistics used: dynamic sampling (level=2)

      24 rows selected.

      ops$tkyte%ORA12CR1> select * from addresses as of period for valid sysdate+3;

           EMPNO ADDR_DATA                      START_DAT END_DATE
      ---------- ------------------------------ --------- ---------
            1234 789 1st Ave                    04-JUL-13

      ops$tkyte%ORA12CR1> @plan
      ops$tkyte%ORA12CR1> select * from table(dbms_xplan.display_cursor);

      PLAN_TABLE_OUTPUT
      -------------------------------------------------------------------------------
      SQL_ID  36bq7shnhc888, child number 0
      -------------------------------------
      select * from addresses as of period for valid sysdate+3

      Plan hash value: 3184888728

      -------------------------------------------------------------------------------
      | Id  | Operation         | Name      | Rows  | Bytes | Cost (%CPU)| Time     |
      -------------------------------------------------------------------------------
      |   0 | SELECT STATEMENT  |           |       |       |     3 (100)|          |
      |*  1 |  TABLE ACCESS FULL| ADDRESSES |     1 |    48 |     3   (0)| 00:00:01 |
      -------------------------------------------------------------------------------

      Predicate Information (identified by operation id):
      ---------------------------------------------------

         1 - filter((("T"."START_DATE" IS NULL OR
                    "T"."START_DATE"<=SYSDATE@!+3) AND ("T"."END_DATE" IS NULL OR
                    "T"."END_DATE">SYSDATE@!+3)))

      Note
      -----
         - dynamic statistics used: dynamic sampling (level=2)

      24 rows selected.

    All in all a nice, easy way to query effective-dated information as of a point in time without a complex where clause. You need to maintain the data - it isn't that a delete will turn into an update that end-dates a record or anything - but if you have tables with start/end dates, this will make it much easier to query them.

    Read the article

  • Unable to install Xdebug

    - by burnt1ce
    I've registered xdebug in php.ini (as per http://xdebug.org/docs/install) but it's not showing up when i run "php -m" or when i get a test page to run "phpinfo()". I've just installed the latest version of XAMPP. Can anyone provide any suggestions in getting xdebug to show up? This is what i get when i run phpinfo(). **PHP Version 5.3.1** System Windows NT ANDREW_LAPTOP 5.1 build 2600 (Windows XP Professional Service Pack 3) i586 Build Date Nov 20 2009 17:20:57 Compiler MSVC6 (Visual C++ 6.0) Architecture x86 Configure Command cscript /nologo configure.js "--enable-snapshot-build" Server API Apache 2.0 Handler Virtual Directory Support enabled Configuration File (php.ini) Path no value Loaded Configuration File C:\xampp\php\php.ini Scan this dir for additional .ini files (none) Additional .ini files parsed (none) PHP API 20090626 PHP Extension 20090626 Zend Extension 220090626 Zend Extension Build API220090626,TS,VC6 PHP Extension Build API20090626,TS,VC6 Debug Build no Thread Safety enabled Zend Memory Manager enabled Zend Multibyte Support disabled IPv6 Support enabled Registered PHP Streams https, ftps, php, file, glob, data, http, ftp, compress.zlib, compress.bzip2, phar, zip Registered Stream Socket Transports tcp, udp, ssl, sslv3, sslv2, tls Registered Stream Filters convert.iconv.*, string.rot13, string.toupper, string.tolower, string.strip_tags, convert.*, consumed, dechunk, zlib.*, bzip2.* This program makes use of the Zend Scripting Language Engine: Zend Engine v2.3.0, Copyright (c) 1998-2009 Zend Technologies PHP Credits Configuration apache2handler Apache Version Apache/2.2.14 (Win32) DAV/2 mod_ssl/2.2.14 OpenSSL/0.9.8l mod_autoindex_color PHP/5.3.1 mod_apreq2-20090110/2.7.1 mod_perl/2.0.4 Perl/v5.10.1 Apache API Version 20051115 Server Administrator postmaster@localhost Hostname:Port localhost:80 Max Requests Per Child: 0 - Keep Alive: on - Max Per Connection: 100 Timeouts Connection: 300 - Keep-Alive: 5 Virtual Server No Server Root C:/xampp/apache Loaded Modules core mod_win32 mpm_winnt http_core mod_so mod_actions mod_alias mod_asis mod_auth_basic mod_auth_digest mod_authn_default mod_authn_file mod_authz_default mod_authz_groupfile mod_authz_host mod_authz_user mod_cgi mod_dav mod_dav_fs mod_dav_lock mod_dir mod_env mod_headers mod_include mod_info mod_isapi mod_log_config mod_mime mod_negotiation mod_rewrite mod_setenvif mod_ssl mod_status mod_autoindex_color mod_php5 mod_perl mod_apreq2 Directive Local Value Master Value engine 1 1 last_modified 0 0 xbithack 0 0 Apache Environment Variable Value MIBDIRS C:/xampp/php/extras/mibs MYSQL_HOME C:\xampp\mysql\bin OPENSSL_CONF C:/xampp/apache/bin/openssl.cnf PHP_PEAR_SYSCONF_DIR C:\xampp\php PHPRC C:\xampp\php TMP C:\xampp\tmp HTTP_HOST localhost HTTP_CONNECTION keep-alive HTTP_USER_AGENT Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/533.2 (KHTML, like Gecko) Chrome/5.0.342.8 Safari/533.2 HTTP_CACHE_CONTROL max-age=0 HTTP_ACCEPT application/xml,application/xhtml+xml,text/html;q=0.9,text/plain;q=0.8,image/png,*/*;q=0.5 HTTP_ACCEPT_ENCODING gzip,deflate,sdch HTTP_ACCEPT_LANGUAGE en-US,en;q=0.8 HTTP_ACCEPT_CHARSET ISO-8859-1,utf-8;q=0.7,*;q=0.3 PATH C:\Documents and Settings\Andrew\Local Settings\Application Data\Google\Chrome\Application;C:\WINDOWS\system32;C:\WINDOWS;C:\WINDOWS\System32\Wbem;c:\Program Files\Microsoft SQL Server\100\Tools\Binn\;c:\Program Files\Microsoft SQL Server\100\DTS\Binn\;C:\Program Files\QuickTime\QTSystem\;C:\Program Files\Common Files\DivX Shared\;C:\Program Files\WiTopia.Net\bin 
SystemRoot C:\WINDOWS COMSPEC C:\WINDOWS\system32\cmd.exe PATHEXT .COM;.EXE;.BAT;.CMD;.VBS;.VBE;.JS;.JSE;.WSF;.WSH WINDIR C:\WINDOWS SERVER_SIGNATURE <address>Apache/2.2.14 (Win32) DAV/2 mod_ssl/2.2.14 OpenSSL/0.9.8l mod_autoindex_color PHP/5.3.1 mod_apreq2-20090110/2.7.1 mod_perl/2.0.4 Perl/v5.10.1 Server at localhost Port 80</address> SERVER_SOFTWARE Apache/2.2.14 (Win32) DAV/2 mod_ssl/2.2.14 OpenSSL/0.9.8l mod_autoindex_color PHP/5.3.1 mod_apreq2-20090110/2.7.1 mod_perl/2.0.4 Perl/v5.10.1 SERVER_NAME localhost SERVER_ADDR 127.0.0.1 SERVER_PORT 80 REMOTE_ADDR 127.0.0.1 DOCUMENT_ROOT C:/xampp/htdocs SERVER_ADMIN postmaster@localhost SCRIPT_FILENAME C:/xampp/htdocs/test.php REMOTE_PORT 3275 GATEWAY_INTERFACE CGI/1.1 SERVER_PROTOCOL HTTP/1.1 REQUEST_METHOD GET QUERY_STRING no value REQUEST_URI /test.php SCRIPT_NAME /test.php HTTP Headers Information HTTP Request Headers HTTP Request GET /test.php HTTP/1.1 Host localhost Connection keep-alive User-Agent Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/533.2 (KHTML, like Gecko) Chrome/5.0.342.8 Safari/533.2 Cache-Control max-age=0 Accept application/xml,application/xhtml+xml,text/html;q=0.9,text/plain;q=0.8,image/png,*/*;q=0.5 Accept-Encoding gzip,deflate,sdch Accept-Language en-US,en;q=0.8 Accept-Charset ISO-8859-1,utf-8;q=0.7,*;q=0.3 HTTP Response Headers X-Powered-By PHP/5.3.1 Keep-Alive timeout=5, max=80 Connection Keep-Alive Transfer-Encoding chunked Content-Type text/html bcmath BCMath support enabled Directive Local Value Master Value bcmath.scale 0 0 bz2 BZip2 Support Enabled Stream Wrapper support compress.bz2:// Stream Filter support bzip2.decompress, bzip2.compress BZip2 Version 1.0.5, 10-Dec-2007 calendar Calendar support enabled com_dotnet COM support enabled DCOM support disabled .Net support enabled Directive Local Value Master Value com.allow_dcom 0 0 com.autoregister_casesensitive 1 1 com.autoregister_typelib 0 0 com.autoregister_verbose 0 0 com.code_page no value no value com.typelib_file no value no value Core PHP Version 5.3.1 Directive Local Value Master Value allow_call_time_pass_reference On On allow_url_fopen On On allow_url_include Off Off always_populate_raw_post_data Off Off arg_separator.input & & arg_separator.output &amp; &amp; asp_tags Off Off auto_append_file no value no value auto_globals_jit On On auto_prepend_file no value no value browscap C:\xampp\php\extras\browscap.ini C:\xampp\php\extras\browscap.ini default_charset no value no value default_mimetype text/html text/html define_syslog_variables Off Off disable_classes no value no value disable_functions no value no value display_errors On On display_startup_errors On On doc_root no value no value docref_ext no value no value docref_root no value no value enable_dl On On error_append_string no value no value error_log no value no value error_prepend_string no value no value error_reporting 22519 22519 exit_on_timeout Off Off expose_php On On extension_dir C:\xampp\php\ext C:\xampp\php\ext file_uploads On On highlight.bg #FFFFFF #FFFFFF highlight.comment #FF8000 #FF8000 highlight.default #0000BB #0000BB highlight.html #000000 #000000 highlight.keyword #007700 #007700 highlight.string #DD0000 #DD0000 html_errors On On ignore_repeated_errors Off Off ignore_repeated_source Off Off ignore_user_abort Off Off implicit_flush Off Off include_path .;C:\xampp\php\PEAR .;C:\xampp\php\PEAR log_errors Off Off log_errors_max_len 1024 1024 magic_quotes_gpc Off Off magic_quotes_runtime Off Off magic_quotes_sybase Off Off mail.add_x_header Off Off 
mail.force_extra_parameters no value no value mail.log no value no value max_execution_time 60 60 max_file_uploads 20 20 max_input_nesting_level 64 64 max_input_time 60 60 memory_limit 128M 128M open_basedir no value no value output_buffering no value no value output_handler no value no value post_max_size 128M 128M precision 14 14 realpath_cache_size 16K 16K realpath_cache_ttl 120 120 register_argc_argv On On register_globals Off Off register_long_arrays Off Off report_memleaks On On report_zend_debug On On request_order no value no value safe_mode Off Off safe_mode_exec_dir no value no value safe_mode_gid Off Off safe_mode_include_dir no value no value sendmail_from no value no value sendmail_path no value no value serialize_precision 100 100 short_open_tag Off Off SMTP localhost localhost smtp_port 25 25 sql.safe_mode Off Off track_errors Off Off unserialize_callback_func no value no value upload_max_filesize 128M 128M upload_tmp_dir C:\xampp\tmp C:\xampp\tmp user_dir no value no value user_ini.cache_ttl 300 300 user_ini.filename .user.ini .user.ini variables_order GPCS GPCS xmlrpc_error_number 0 0 xmlrpc_errors Off Off y2k_compliance On On zend.enable_gc On On ctype ctype functions enabled date date/time support enabled "Olson" Timezone Database Version 2009.18 Timezone Database internal Default timezone America/New_York Directive Local Value Master Value date.default_latitude 31.7667 31.7667 date.default_longitude 35.2333 35.2333 date.sunrise_zenith 90.583333 90.583333 date.sunset_zenith 90.583333 90.583333 date.timezone America/New_York America/New_York dom DOM/XML enabled DOM/XML API Version 20031129 libxml Version 2.7.6 HTML Support enabled XPath Support enabled XPointer Support enabled Schema Support enabled RelaxNG Support enabled ereg Regex Library System library enabled exif EXIF Support enabled EXIF Version 1.4 $Id: exif.c 287372 2009-08-16 14:32:32Z iliaa $ Supported EXIF Version 0220 Supported filetypes JPEG,TIFF Directive Local Value Master Value exif.decode_jis_intel JIS JIS exif.decode_jis_motorola JIS JIS exif.decode_unicode_intel UCS-2LE UCS-2LE exif.decode_unicode_motorola UCS-2BE UCS-2BE exif.encode_jis no value no value exif.encode_unicode ISO-8859-15 ISO-8859-15 fileinfo fileinfo support enabled version 1.0.5-dev filter Input Validation and Filtering enabled Revision $Revision: 289434 $ Directive Local Value Master Value filter.default unsafe_raw unsafe_raw filter.default_flags no value no value ftp FTP support enabled gd GD Support enabled GD Version bundled (2.0.34 compatible) FreeType Support enabled FreeType Linkage with freetype FreeType Version 2.3.11 T1Lib Support enabled GIF Read Support enabled GIF Create Support enabled JPEG Support enabled libJPEG Version 7 PNG Support enabled libPNG Version 1.2.40 WBMP Support enabled XBM Support enabled JIS-mapped Japanese Font Support enabled Directive Local Value Master Value gd.jpeg_ignore_warning 0 0 gettext GetText Support enabled hash hash support enabled Hashing Engines md2 md4 md5 sha1 sha224 sha256 sha384 sha512 ripemd128 ripemd160 ripemd256 ripemd320 whirlpool tiger128,3 tiger160,3 tiger192,3 tiger128,4 tiger160,4 tiger192,4 snefru snefru256 gost adler32 crc32 crc32b salsa10 salsa20 haval128,3 haval160,3 haval192,3 haval224,3 haval256,3 haval128,4 haval160,4 haval192,4 haval224,4 haval256,4 haval128,5 haval160,5 haval192,5 haval224,5 haval256,5 iconv iconv support enabled iconv implementation "libiconv" iconv library version 1.13 Directive Local Value Master Value iconv.input_encoding ISO-8859-1 ISO-8859-1 
iconv.internal_encoding ISO-8859-1 ISO-8859-1 iconv.output_encoding ISO-8859-1 ISO-8859-1 imap IMAP c-Client Version 2007e SSL Support enabled json json support enabled json version 1.2.1 libxml libXML support active libXML Compiled Version 2.7.6 libXML Loaded Version 20706 libXML streams enabled mbstring Multibyte Support enabled Multibyte string engine libmbfl HTTP input encoding translation disabled mbstring extension makes use of "streamable kanji code filter and converter", which is distributed under the GNU Lesser General Public License version 2.1. Multibyte (japanese) regex support enabled Multibyte regex (oniguruma) version 4.7.1 Directive Local Value Master Value mbstring.detect_order no value no value mbstring.encoding_translation Off Off mbstring.func_overload 0 0 mbstring.http_input pass pass mbstring.http_output pass pass mbstring.http_output_conv_mimetypes ^(text/|application/xhtml\+xml) ^(text/|application/xhtml\+xml) mbstring.internal_encoding no value no value mbstring.language neutral neutral mbstring.strict_detection Off Off mbstring.substitute_character no value no value mcrypt mcrypt support enabled Version 2.5.8 Api No 20021217 Supported ciphers cast-128 gost rijndael-128 twofish arcfour cast-256 loki97 rijndael-192 saferplus wake blowfish-compat des rijndael-256 serpent xtea blowfish enigma rc2 tripledes Supported modes cbc cfb ctr ecb ncfb nofb ofb stream Directive Local Value Master Value mcrypt.algorithms_dir no value no value mcrypt.modes_dir no value no value mhash MHASH support Enabled MHASH API Version Emulated Support ming Ming SWF output library enabled Version 0.4.3 mysql MySQL Support enabled Active Persistent Links 0 Active Links 0 Client API version 5.1.41 Directive Local Value Master Value mysql.allow_local_infile On On mysql.allow_persistent On On mysql.connect_timeout 60 60 mysql.default_host no value no value mysql.default_password no value no value mysql.default_port 3306 3306 mysql.default_socket MySQL MySQL mysql.default_user no value no value mysql.max_links Unlimited Unlimited mysql.max_persistent Unlimited Unlimited mysql.trace_mode Off Off mysqli MysqlI Support enabled Client API library version 5.1.41 Active Persistent Links 0 Inactive Persistent Links 0 Active Links 0 Client API header version 5.1.41 MYSQLI_SOCKET MySQL Directive Local Value Master Value mysqli.allow_local_infile On On mysqli.allow_persistent On On mysqli.default_host no value no value mysqli.default_port 3306 3306 mysqli.default_pw no value no value mysqli.default_socket MySQL MySQL mysqli.default_user no value no value mysqli.max_links Unlimited Unlimited mysqli.max_persistent Unlimited Unlimited mysqli.reconnect Off Off mysqlnd mysqlnd enabled Version mysqlnd 5.0.5-dev - 081106 - $Revision: 289630 $ Command buffer size 4096 Read buffer size 32768 Read timeout 31536000 Collecting statistics Yes Collecting memory statistics No Client statistics bytes_sent 0 bytes_received 0 packets_sent 0 packets_received 0 protocol_overhead_in 0 protocol_overhead_out 0 bytes_received_ok_packet 0 bytes_received_eof_packet 0 bytes_received_rset_header_packet 0 bytes_received_rset_field_meta_packet 0 bytes_received_rset_row_packet 0 bytes_received_prepare_response_packet 0 bytes_received_change_user_packet 0 packets_sent_command 0 packets_received_ok 0 packets_received_eof 0 packets_received_rset_header 0 packets_received_rset_field_meta 0 packets_received_rset_row 0 packets_received_prepare_response 0 packets_received_change_user 0 result_set_queries 0 non_result_set_queries 0 no_index_used 
0 bad_index_used 0 slow_queries 0 buffered_sets 0 unbuffered_sets 0 ps_buffered_sets 0 ps_unbuffered_sets 0 flushed_normal_sets 0 flushed_ps_sets 0 ps_prepared_never_executed 0 ps_prepared_once_executed 0 rows_fetched_from_server_normal 0 rows_fetched_from_server_ps 0 rows_buffered_from_client_normal 0 rows_buffered_from_client_ps 0 rows_fetched_from_client_normal_buffered 0 rows_fetched_from_client_normal_unbuffered 0 rows_fetched_from_client_ps_buffered 0 rows_fetched_from_client_ps_unbuffered 0 rows_fetched_from_client_ps_cursor 0 rows_skipped_normal 0 rows_skipped_ps 0 copy_on_write_saved 0 copy_on_write_performed 0 command_buffer_too_small 0 connect_success 0 connect_failure 0 connection_reused 0 reconnect 0 pconnect_success 0 active_connections 0 active_persistent_connections 0 explicit_close 0 implicit_close 0 disconnect_close 0 in_middle_of_command_close 0 explicit_free_result 0 implicit_free_result 0 explicit_stmt_close 0 implicit_stmt_close 0 mem_emalloc_count 0 mem_emalloc_ammount 0 mem_ecalloc_count 0 mem_ecalloc_ammount 0 mem_erealloc_count 0 mem_erealloc_ammount 0 mem_efree_count 0 mem_malloc_count 0 mem_malloc_ammount 0 mem_calloc_count 0 mem_calloc_ammount 0 mem_realloc_count 0 mem_realloc_ammount 0 mem_free_count 0 proto_text_fetched_null 0 proto_text_fetched_bit 0 proto_text_fetched_tinyint 0 proto_text_fetched_short 0 proto_text_fetched_int24 0 proto_text_fetched_int 0 proto_text_fetched_bigint 0 proto_text_fetched_decimal 0 proto_text_fetched_float 0 proto_text_fetched_double 0 proto_text_fetched_date 0 proto_text_fetched_year 0 proto_text_fetched_time 0 proto_text_fetched_datetime 0 proto_text_fetched_timestamp 0 proto_text_fetched_string 0 proto_text_fetched_blob 0 proto_text_fetched_enum 0 proto_text_fetched_set 0 proto_text_fetched_geometry 0 proto_text_fetched_other 0 proto_binary_fetched_null 0 proto_binary_fetched_bit 0 proto_binary_fetched_tinyint 0 proto_binary_fetched_short 0 proto_binary_fetched_int24 0 proto_binary_fetched_int 0 proto_binary_fetched_bigint 0 proto_binary_fetched_decimal 0 proto_binary_fetched_float 0 proto_binary_fetched_double 0 proto_binary_fetched_date 0 proto_binary_fetched_year 0 proto_binary_fetched_time 0 proto_binary_fetched_datetime 0 proto_binary_fetched_timestamp 0 proto_binary_fetched_string 0 proto_binary_fetched_blob 0 proto_binary_fetched_enum 0 proto_binary_fetched_set 0 proto_binary_fetched_geometry 0 proto_binary_fetched_other 0 init_command_executed_count 0 init_command_failed_count 0 odbc ODBC Support enabled Active Persistent Links 0 Active Links 0 ODBC library Win32 Directive Local Value Master Value odbc.allow_persistent On On odbc.check_persistent On On odbc.default_cursortype Static cursor Static cursor odbc.default_db no value no value odbc.default_pw no value no value odbc.default_user no value no value odbc.defaultbinmode return as is return as is odbc.defaultlrl return up to 4096 bytes return up to 4096 bytes odbc.max_links Unlimited Unlimited odbc.max_persistent Unlimited Unlimited openssl OpenSSL support enabled OpenSSL Library Version OpenSSL 0.9.8l 5 Nov 2009 OpenSSL Header Version OpenSSL 0.9.8l 5 Nov 2009 pcre PCRE (Perl Compatible Regular Expressions) Support enabled PCRE Library Version 8.00 2009-10-19 Directive Local Value Master Value pcre.backtrack_limit 100000 100000 pcre.recursion_limit 100000 100000 pdf PDF Support enabled PDFlib GmbH Version 7.0.4p4 PECL Version 2.1.6 Revision $Revision: 277110 $ PDO PDO support enabled PDO drivers mysql, odbc, sqlite, sqlite2 pdo_mysql PDO Driver for MySQL enabled 
Client API version 5.1.41 PDO_ODBC PDO Driver for ODBC (Win32) enabled ODBC Connection Pooling Enabled, strict matching pdo_sqlite PDO Driver for SQLite 3.x enabled SQLite Library 3.6.20 Phar Phar: PHP Archive support enabled Phar EXT version 2.0.1 Phar API version 1.1.1 CVS revision $Revision: 286338 $ Phar-based phar archives enabled Tar-based phar archives enabled ZIP-based phar archives enabled gzip compression enabled bzip2 compression enabled Native OpenSSL support enabled Phar based on pear/PHP_Archive, original concept by Davey Shafik. Phar fully realized by Gregory Beaver and Marcus Boerger. Portions of tar implementation Copyright (c) 2003-2009 Tim Kientzle. Directive Local Value Master Value phar.cache_list no value no value phar.readonly On On phar.require_hash On On Reflection Reflection enabled Version $Revision: 287991 $ session Session Support enabled Registered save handlers files user sqlite Registered serializer handlers php php_binary wddx Directive Local Value Master Value session.auto_start Off Off session.bug_compat_42 On On session.bug_compat_warn On On session.cache_expire 180 180 session.cache_limiter nocache nocache session.cookie_domain no value no value session.cookie_httponly Off Off session.cookie_lifetime 0 0 session.cookie_path / / session.cookie_secure Off Off session.entropy_file no value no value session.entropy_length 0 0 session.gc_divisor 100 100 session.gc_maxlifetime 1440 1440 session.gc_probability 1 1 session.hash_bits_per_character 5 5 session.hash_function 0 0 session.name PHPSESSID PHPSESSID session.referer_check no value no value session.save_handler files files session.save_path C:\xampp\tmp C:\xampp\tmp session.serialize_handler php php session.use_cookies On On session.use_only_cookies Off Off session.use_trans_sid 0 0 SimpleXML Simplexml support enabled Revision $Revision: 281953 $ Schema support enabled soap Soap Client enabled Soap Server enabled Directive Local Value Master Value soap.wsdl_cache 1 1 soap.wsdl_cache_dir /tmp /tmp soap.wsdl_cache_enabled 1 1 soap.wsdl_cache_limit 5 5 soap.wsdl_cache_ttl 86400 86400 sockets Sockets Support enabled SPL SPL support enabled Interfaces Countable, OuterIterator, RecursiveIterator, SeekableIterator, SplObserver, SplSubject Classes AppendIterator, ArrayIterator, ArrayObject, BadFunctionCallException, BadMethodCallException, CachingIterator, DirectoryIterator, DomainException, EmptyIterator, FilesystemIterator, FilterIterator, GlobIterator, InfiniteIterator, InvalidArgumentException, IteratorIterator, LengthException, LimitIterator, LogicException, MultipleIterator, NoRewindIterator, OutOfBoundsException, OutOfRangeException, OverflowException, ParentIterator, RangeException, RecursiveArrayIterator, RecursiveCachingIterator, RecursiveDirectoryIterator, RecursiveFilterIterator, RecursiveIteratorIterator, RecursiveRegexIterator, RecursiveTreeIterator, RegexIterator, RuntimeException, SplDoublyLinkedList, SplFileInfo, SplFileObject, SplFixedArray, SplHeap, SplMinHeap, SplMaxHeap, SplObjectStorage, SplPriorityQueue, SplQueue, SplStack, SplTempFileObject, UnderflowException, UnexpectedValueException SQLite SQLite support enabled PECL Module version 2.0-dev $Id: sqlite.c 289598 2009-10-12 22:37:52Z pajoye $ SQLite Library 2.8.17 SQLite Encoding iso8859 Directive Local Value Master Value sqlite.assoc_case 0 0 sqlite3 SQLite3 support enabled SQLite3 module version 0.7-dev SQLite Library 3.6.20 Directive Local Value Master Value sqlite3.extension_dir no value no value standard Dynamic Library Support 
enabled Internal Sendmail Support for Windows enabled Directive Local Value Master Value assert.active 1 1 assert.bail 0 0 assert.callback no value no value assert.quiet_eval 0 0 assert.warning 1 1 auto_detect_line_endings 0 0 default_socket_timeout 60 60 safe_mode_allowed_env_vars PHP_ PHP_ safe_mode_protected_env_vars LD_LIBRARY_PATH LD_LIBRARY_PATH url_rewriter.tags a=href,area=href,frame=src,input=src,form=,fieldset= a=href,area=href,frame=src,input=src,form=,fieldset= user_agent no value no value tokenizer Tokenizer Support enabled wddx WDDX Support enabled WDDX Session Serializer enabled xml XML Support active XML Namespace Support active libxml2 Version 2.7.6 xmlreader XMLReader enabled xmlrpc core library version xmlrpc-epi v. 0.54 php extension version 0.51 author Dan Libby homepage http://xmlrpc-epi.sourceforge.net open sourced by Epinions.com xmlwriter XMLWriter enabled xsl XSL enabled libxslt Version 1.1.26 libxslt compiled against libxml Version 2.7.6 EXSLT enabled libexslt Version 1.1.26 zip Zip enabled Extension Version $Id: php_zip.c 276389 2009-02-24 23:55:14Z iliaa $ Zip version 1.9.1 Libzip version 0.9.0 zlib ZLib Support enabled Stream Wrapper support compress.zlib:// Stream Filter support zlib.inflate, zlib.deflate Compiled Version 1.2.3 Linked Version 1.2.3 Directive Local Value Master Value zlib.output_compression Off Off zlib.output_compression_level -1 -1 zlib.output_handler no value no value Additional Modules Module Name Environment Variable Value no value ::=::\ no value C:=C:\xampp ALLUSERSPROFILE C:\Documents and Settings\All Users APPDATA C:\Documents and Settings\Andrew\Application Data CHROME_RESTART Google Chrome|Whoa! Google Chrome has crashed. Restart now?|LEFT_TO_RIGHT CHROME_VERSION 5.0.342.8 CLASSPATH .;C:\Program Files\QuickTime\QTSystem\QTJava.zip CommonProgramFiles C:\Program Files\Common Files COMPUTERNAME ANDREW_LAPTOP ComSpec C:\WINDOWS\system32\cmd.exe FP_NO_HOST_CHECK NO HOMEDRIVE C: HOMEPATH \Documents and Settings\Andrew LOGONSERVER \\ANDREW_LAPTOP NUMBER_OF_PROCESSORS 2 OS Windows_NT PATH C:\Documents and Settings\Andrew\Local Settings\Application Data\Google\Chrome\Application;C:\WINDOWS\system32;C:\WINDOWS;C:\WINDOWS\System32\Wbem;c:\Program Files\Microsoft SQL Server\100\Tools\Binn\;c:\Program Files\Microsoft SQL Server\100\DTS\Binn\;C:\Program Files\QuickTime\QTSystem\;C:\Program Files\Common Files\DivX Shared\;C:\Program Files\WiTopia.Net\bin PATHEXT .COM;.EXE;.BAT;.CMD;.VBS;.VBE;.JS;.JSE;.WSF;.WSH PROCESSOR_ARCHITECTURE x86 PROCESSOR_IDENTIFIER x86 Family 6 Model 15 Stepping 10, GenuineIntel PROCESSOR_LEVEL 6 PROCESSOR_REVISION 0f0a ProgramFiles C:\Program Files PROMPT $P$G QTJAVA C:\Program Files\QuickTime\QTSystem\QTJava.zip SESSIONNAME Console sfxcmd "C:\Documents and Settings\Andrew\My Documents\Downloads\xampp-win32-1.7.3.exe" sfxname C:\Documents and Settings\Andrew\My Documents\Downloads\xampp-win32-1.7.3.exe SystemDrive C: SystemRoot C:\WINDOWS TEMP C:\DOCUME~1\Andrew\LOCALS~1\Temp TMP C:\DOCUME~1\Andrew\LOCALS~1\Temp USERDOMAIN ANDREW_LAPTOP USERNAME Andrew USERPROFILE C:\Documents and Settings\Andrew VS100COMNTOOLS C:\Program Files\Microsoft Visual Studio 10.0\Common7\Tools\ windir C:\WINDOWS AP_PARENT_PID 2216 PHP Variables Variable Value _SERVER["MIBDIRS"] C:/xampp/php/extras/mibs _SERVER["MYSQL_HOME"] C:\xampp\mysql\bin _SERVER["OPENSSL_CONF"] C:/xampp/apache/bin/openssl.cnf _SERVER["PHP_PEAR_SYSCONF_DIR"] C:\xampp\php _SERVER["PHPRC"] C:\xampp\php _SERVER["TMP"] C:\xampp\tmp _SERVER["HTTP_HOST"] localhost 
_SERVER["HTTP_CONNECTION"] keep-alive _SERVER["HTTP_USER_AGENT"] Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/533.2 (KHTML, like Gecko) Chrome/5.0.342.8 Safari/533.2 _SERVER["HTTP_CACHE_CONTROL"] max-age=0 _SERVER["HTTP_ACCEPT"] application/xml,application/xhtml+xml,text/html;q=0.9,text/plain;q=0.8,image/png,*/*;q=0.5 _SERVER["HTTP_ACCEPT_ENCODING"] gzip,deflate,sdch _SERVER["HTTP_ACCEPT_LANGUAGE"] en-US,en;q=0.8 _SERVER["HTTP_ACCEPT_CHARSET"] ISO-8859-1,utf-8;q=0.7,*;q=0.3 _SERVER["PATH"] C:\Documents and Settings\Andrew\Local Settings\Application Data\Google\Chrome\Application;C:\WINDOWS\system32;C:\WINDOWS;C:\WINDOWS\System32\Wbem;c:\Program Files\Microsoft SQL Server\100\Tools\Binn\;c:\Program Files\Microsoft SQL Server\100\DTS\Binn\;C:\Program Files\QuickTime\QTSystem\;C:\Program Files\Common Files\DivX Shared\;C:\Program Files\WiTopia.Net\bin _SERVER["SystemRoot"] C:\WINDOWS _SERVER["COMSPEC"] C:\WINDOWS\system32\cmd.exe _SERVER["PATHEXT"] .COM;.EXE;.BAT;.CMD;.VBS;.VBE;.JS;.JSE;.WSF;.WSH _SERVER["WINDIR"] C:\WINDOWS _SERVER["SERVER_SIGNATURE"] <address>Apache/2.2.14 (Win32) DAV/2 mod_ssl/2.2.14 OpenSSL/0.9.8l mod_autoindex_color PHP/5.3.1 mod_apreq2-20090110/2.7.1 mod_perl/2.0.4 Perl/v5.10.1 Server at localhost Port 80</address> _SERVER["SERVER_SOFTWARE"] Apache/2.2.14 (Win32) DAV/2 mod_ssl/2.2.14 OpenSSL/0.9.8l mod_autoindex_color PHP/5.3.1 mod_apreq2-20090110/2.7.1 mod_perl/2.0.4 Perl/v5.10.1 _SERVER["SERVER_NAME"] localhost _SERVER["SERVER_ADDR"] 127.0.0.1 _SERVER["SERVER_PORT"] 80 _SERVER["REMOTE_ADDR"] 127.0.0.1 _SERVER["DOCUMENT_ROOT"] C:/xampp/htdocs _SERVER["SERVER_ADMIN"] postmaster@localhost _SERVER["SCRIPT_FILENAME"] C:/xampp/htdocs/test.php _SERVER["REMOTE_PORT"] 3275 _SERVER["GATEWAY_INTERFACE"] CGI/1.1 _SERVER["SERVER_PROTOCOL"] HTTP/1.1 _SERVER["REQUEST_METHOD"] GET _SERVER["QUERY_STRING"] no value _SERVER["REQUEST_URI"] /test.php _SERVER["SCRIPT_NAME"] /test.php _SERVER["PHP_SELF"] /test.php _SERVER["REQUEST_TIME"] 1270600868 _SERVER["argv"] Array ( ) _SERVER["argc"] 0 PHP License This program is free software; you can redistribute it and/or modify it under the terms of the PHP License as published by the PHP Group and included in the distribution in the file: LICENSE This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. If you did not receive a copy of the PHP license, or have any questions about PHP licensing, please contact [email protected].

    Read the article

  • Passing variables to shopping cart with Javascript

    - by albatross
    This question is an extension of this one: http://stackoverflow.com/questions/2359238/calculate-order-price-by-date-selection-value I'm trying to make a conference registration page based off the previous page, which passes the variables(name, email, price) to my organization's outdated shopping cart using javascript. I'm also using Seminar Registration by CSSTricks (http://css-tricks.com/examples/SeminarRegTutorial/) Currently, my proceed to payment button produces an 'element is undefined' error on line 298(same thing on unresolved previous question, linked above^): switch (document.Information.amount.value) { Any help would be greatly appreciated. I'm at my wits end with this. Here is the page: <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd"> <html xmlns="http://www.w3.org/1999/xhtml" xml:lang="en" lang="en"> <head> <meta http-equiv="Content-Type" content="text/html; charset=UTF-8" /> <title>Seminar Registration Form with jQuery</title> <link rel="stylesheet" type="text/css" href="css/style.css" media="screen" /> <script src="js/jquery-1.2.6.js" type="text/javascript" charset="utf-8"></script> <script src="js/form-fun.jquery.js" type="text/javascript" charset="utf-8"></script> <!--[if IE]> <style type="text/css"> legend { position: relative; top: -30px; } fieldset { margin: 30px 10px 0 0; } </style> <script type="text/javascript"> $(function(){ $("#step_2 legend").css({ opacity: 0.5 }); $("#step_3 legend").css({ opacity: 0.5 }); }); </script> <![endif]--> </head> <body> <div id="page-wrap"> <h1>Conference <span>Registration</span></h1> <form action="#" method="post"> <fieldset id="step_1"> <legend>Step 1</legend> <label for="num_attendees"> How cool are you? </label> <select id="amount"> <option id="0" value="0">Please Choose</option> <option id="prof" value="90.00">Professional</option> <option id="grad" value="55.00">Graduate Student</option> </select> <br /> <div id="attendee_1_wrap" class="name_wrap push"> <h3>Who are you?</h3> <p> <label for="FirstName"> First Name: </label> <input type="text" id="FirstName" class="name_input"></input> </p> <p> <label for="LastName"> Last Name: </label> <input type="text" id="LastName" class="name_input"></input> </p> <p> <label for="OfficialTitle"> Official Title: </label> <input type="text" id="OfficialTitle" class="name_input"></input> </p> <h3>How do we find you?</h3> <label for="email">Email: </label> <input id="email" name="email" class="required email" /> </p> <p> <label for="Address">Street Address: </label><input name="Address" id="Address" type="text" size="20" maxlength="75" /> </p> <p> <label for="City">City: </label><input name="City" id="City" /> </p> <p> <label for="State">State: </label><select name="State" id="State"> <option selected value="IL">IL</option> <option value="AL">AL</option> <option value="AK">AK</option> <option value="AZ">AZ</option> <option value="AR">AR</option> <option value="CA">CA</option> <option value="CO">CO</option> <option value="CT">CT</option> <option value="DE">DE</option> <option value="DC">DC</option> <option value="FL">FL</option> <option value="GA">GA</option> <option value="HI">HI</option> <option value="ID">ID</option> <option value="IN">IN</option> <option value="IA">IA</option> <option value="KS">KS</option> <option value="KY">KY</option> <option value="LA">LA</option> <option value="ME">ME</option> <option value="MD">MD</option> <option value="MA">MA</option> <option value="MI">MI</option> <option value="MN">MN</option> <option 
value="MS">MS</option> <option value="MO">MO</option> <option value="MT">MT</option> <option value="NE">NE</option> <option value="NV">NV</option> <option value="NH">NH</option> <option value="NJ">NJ</option> <option value="NM">NM</option> <option value="NY">NY</option> <option value="NC">NC</option> <option value="ND">ND</option> <option value="OH">OH</option> <option value="OK">OK</option> <option value="OR">OR</option> <option value="PA">PA</option> <option value="RI">RI</option> <option value="SC">SC</option> <option value="SD">SD</option> <option value="TN">TN</option> <option value="TX">TX</option> <option value="UT">UT</option> <option value="VT">VT</option> <option value="VA">VA</option> <option value="WA">WA</option> <option value="WV">WV</option> <option value="WI">WI</option> <option value="WY">WY</option> </select> </p> <p> <label for="Zip">Zip Code: </label><input name="Zip" id="Zip" type="text" value="" size="5" maxlength="10" /> </p> <p> <label for="Phone">Telephone: </label><input name="Phone" id="Phone" type="text" value="" size="10" maxlength="13" /> </p> </div> </fieldset> <fieldset id="step_2"> <legend>Step 2</legend> <p> Do you work in Higher Education? </p> <input type="radio" id="company_name_toggle_on" name="company_name_toggle_group"></input> <label for="company_name_toggle_on">Yes</label> &emsp; <input type="radio" id="company_name_toggle_off" name="company_name_toggle_group"></input> <label for="company_name_toggle_off">No</label> <div id="company_name_wrap"> <label for="company_name"> Which School? </label> <input type="text" id="company_name"></input> </div> <div class="push"> <p> Will anyone in your group require special accommodations? </p> <input type="radio" id="special_accommodations_toggle_on" name="special_accommodations_toggle"></input> <label for="special_accommodations_toggle_on">Yes</label> &emsp; <input type="radio" id="special_accommodations_toggle_off" name="special_accommodations_toggle"></input> <label for="special_accommodations_toggle_off">No</label> </div> <div id="special_accommodations_wrap"> <label for="special_accomodations_text"> Please explain below: </label> <textarea rows="10" cols="10" id="special_accomodations_text"></textarea> </div> </fieldset> <fieldset id="step_3"> <legend>Step 3</legend> <label for="rock"> Are you ready to rock? 
</label> <input type="checkbox" id="rock"></input> <p> <INPUT onclick="javascript:PaymentButtonClick()" type=button value="Proceed to payment" name=PaymentButton> <img src="images/visa1.gif" /> <img src="images/mastercard1.gif" /> </p> </fieldset> </form> </div> <FORM name="emailForm" action="mailform.asp" method=post"> <INPUT type="hidden" value="Conference Registration" name="mf_subject"> <INPUT type="hidden" value="Yes" name="mf_email_results"> <INPUT type="hidden" title="" style="BACKGROUND-COLOR: #ffffa0" size="20" name="num_attendees"> <INPUT type="hidden" title="" style="BACKGROUND-COLOR: #ffffa0" size="17" name="FirstName"> <INPUT type="hidden" title="" style="BACKGROUND-COLOR: #ffffa0" size="22" name="LastName"> <INPUT type="hidden" title="" style="BACKGROUND-COLOR: #ffffff" size="64" name="OfficialTitle"> <INPUT type="hidden" title="" style="BACKGROUND-COLOR: #ffffff" size="40" name="email"> <INPUT type="hidden" title="" style="BACKGROUND-COLOR: #ffffff" size="48" name="Address"> <INPUT type="hidden" title="" style="BACKGROUND-COLOR: #ffffa0" size="17" name="City"> <INPUT type="hidden" title="" style="BACKGROUND-COLOR: #ffffa0" size="17" name="State"> <INPUT type="hidden" title="" style="BACKGROUND-COLOR: #ffffa0" size="17" name="Zip"> <INPUT type="hidden" title="" style="BACKGROUND-COLOR: #ffffa0" size="17" name="Phone"> <INPUT type="hidden" title="" style="BACKGROUND-COLOR: #ffffa0" size="17" name="company_name"> <INPUT type="hidden" title="" style="BACKGROUND-COLOR: #ffffff" size="20" name="special_accomodations_text"> <INPUT type="hidden" value="[email protected]" name="mf_from"> <INPUT type="hidden" value="[email protected]" name="mf_to"> </FORM> <FORM name="addform" action="https://webcluster.niu.edu/CreditCard/servlet/Shopping_Cart_Add_Item_Servlet" method="post"> <INPUT type="hidden" value="orient" name="Dept_ID"> <INPUT type="hidden" value="Orientation" name="Product_Name"> <INPUT type="hidden" value="z000000" name="Product_Code"> <INPUT type="hidden" value="" name="amount"> <INPUT type="hidden" value="/orientation/index.shtml" name="return_link"> <INPUT type="hidden" value="http://www.niu.edu" name="return_server"> <INPUT type="hidden" value="1" name="quantity"> <INPUT type="hidden" value="0" name="tax"> <INPUT type="hidden" value="0" name="ship"> <INPUT type="hidden" value="DQ83225" name="sale_id"> <INPUT type="hidden" value="XXXXXX" name="sale_acct"> </FORM> <SCRIPT language="Javascript"> function PaymentButtonClick() { switch (document.Information.amount.value) { case 'prof': document.Information.amount.value = 90.00; break; case 'grad': document.Information.amount.value = 55.00; break; } document.addform.Product_Name.value = document.Information.FirstName.value + ","+ document.Information.LastName.value+","+ document.Information.OfficialTitle.value+","+ document.Information.email.name+","+","+ document.Information.Address.value+ "," + document.Information.City.value+ "," + document.Information.State.value+ "," + document.Information.Zip.value+ "," + document.Information.Phone.value+ "," + document.Information.company_name.value+ "," + document.Information.special_accomodations_text.value; document.addform.Product_Code.value = document.Information.LastName.value; if ((document.Information.UCheck.checked==true) && (document.Information.altDate1.value != "") && (document.Information.altDate1.value != "x")) { if (document.Information.StudentLastName.value != "" || document.Information.StudentFirstName.value != "" || document.Information.StudentID.value != "" ) { 
document.addform.submit(); } else { alert("Please enter missing information"); } } } </SCRIPT> </body> </html>
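
    For what it is worth, the markup above never defines a form named "Information" - the registration form has no name attribute and the price select only has id="amount" - so document.Information.amount is undefined when the switch runs. A minimal sketch of one way to wire it up (an assumption, not the author's final code), reading the fields by id and filling the cart form directly:

      // Sketch: the option values are already the prices ("90.00" / "55.00"),
      // so there is no need for the switch at all.
      function PaymentButtonClick() {
          var price = document.getElementById("amount").value;   // "0" until a choice is made
          if (price === "0" || price === "") {
              alert("Please choose a registration type");
              return;
          }
          document.addform.amount.value = price;
          document.addform.Product_Code.value = document.getElementById("LastName").value;
          // the UCheck/altDate1/Student* checks in the original handler refer to fields
          // that do not exist in this markup, so they are dropped in this sketch
          document.addform.submit();
      }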

    Read the article

  • Installing vim7.2 on Solaris Sparc 10 as non-root

    - by Tobbe
    I'm trying to install vim to $HOME/bin by compiling the sources. ./configure --prefix=$home/bin seems to work, but when running make I get: > make Starting make in the src directory. If there are problems, cd to the src directory and run make there cd src && make first gcc -c -I. -Iproto -DHAVE_CONFIG_H -DFEAT_GUI_GTK -I/usr/include/gtk-2.0 -I/usr/lib/gtk-2.0/include -I/usr/include/atk-1.0 -I/usr/include/pango-1.0 -I/usr/openwin/include -I/usr/sfw/include -I/usr/sfw/include/freetype2 -I/usr/include/glib-2.0 -I/usr/lib/glib-2.0/include -g -O2 -I/usr/openwin/include -o objects/buffer.o buffer.c In file included from buffer.c:28: vim.h:41: error: syntax error before ':' token In file included from os_unix.h:29, from vim.h:245, from buffer.c:28: /usr/include/sys/stat.h:251: error: syntax error before "blksize_t" /usr/include/sys/stat.h:255: error: syntax error before '}' token /usr/include/sys/stat.h:309: error: syntax error before "blksize_t" /usr/include/sys/stat.h:310: error: conflicting types for 'st_blocks' /usr/include/sys/stat.h:252: error: previous declaration of 'st_blocks' was here /usr/include/sys/stat.h:313: error: syntax error before '}' token In file included from /opt/local/bin/../lib/gcc/sparc-sun-solaris2.6/3.4.6/include/sys/signal.h:132, from /usr/include/signal.h:26, from os_unix.h:163, from vim.h:245, from buffer.c:28: /usr/include/sys/siginfo.h:259: error: syntax error before "ctid_t" /usr/include/sys/siginfo.h:292: error: syntax error before '}' token /usr/include/sys/siginfo.h:294: error: syntax error before '}' token /usr/include/sys/siginfo.h:390: error: syntax error before "ctid_t" /usr/include/sys/siginfo.h:398: error: conflicting types for '__fault' /usr/include/sys/siginfo.h:267: error: previous declaration of '__fault' was here /usr/include/sys/siginfo.h:404: error: conflicting types for '__file' /usr/include/sys/siginfo.h:273: error: previous declaration of '__file' was here /usr/include/sys/siginfo.h:420: error: conflicting types for '__prof' /usr/include/sys/siginfo.h:287: error: previous declaration of '__prof' was here /usr/include/sys/siginfo.h:424: error: conflicting types for '__rctl' /usr/include/sys/siginfo.h:291: error: previous declaration of '__rctl' was here /usr/include/sys/siginfo.h:426: error: syntax error before '}' token /usr/include/sys/siginfo.h:428: error: syntax error before '}' token /usr/include/sys/siginfo.h:432: error: syntax error before "k_siginfo_t" /usr/include/sys/siginfo.h:437: error: syntax error before '}' token In file included from /usr/include/signal.h:26, from os_unix.h:163, from vim.h:245, from buffer.c:28: /opt/local/bin/../lib/gcc/sparc-sun-solaris2.6/3.4.6/include/sys/signal.h:173: error: syntax error before "siginfo_t" In file included from os_unix.h:163, from vim.h:245, from buffer.c:28: /usr/include/signal.h:111: error: syntax error before "siginfo_t" /usr/include/signal.h:113: error: syntax error before "siginfo_t" buffer.c: In function `buflist_new': buffer.c:1502: error: storage size of 'st' isn't known buffer.c: In function `buflist_findname': buffer.c:1989: error: storage size of 'st' isn't known buffer.c: In function `setfname': buffer.c:2578: error: storage size of 'st' isn't known buffer.c: In function `otherfile_buf': buffer.c:2836: error: storage size of 'st' isn't known buffer.c: In function `buf_setino': buffer.c:2874: error: storage size of 'st' isn't known buffer.c: In function `buf_same_ino': buffer.c:2894: error: dereferencing pointer to incomplete type buffer.c:2895: error: dereferencing pointer to 
incomplete type *** Error code 1 make: Fatal error: Command failed for target `objects/buffer.o' Current working directory /home/xluntor/vim72/src *** Error code 1 make: Fatal error: Command failed for target `first'

How do I fix the make errors? Or is there another way to install vim as non-root? Thanks in advance.

EDIT: I took a look at the Google Groups link Sarah posted. The "Compiling Vim" page linked from there was for Linux, so the commands don't even work on Solaris. But it did hint at logging the output of ./configure to a file, so I did that. Here it is: ./configure output removed. New version further down. Does anyone spot anything critical missing?

EDIT 2: So I downloaded the vim package from sunfreeware. I couldn't just install it, since I don't have root privileges, but I was able to extract the package file. This was the file structure in it:

`-- SMCvim
    `-- reloc
        |-- bin
        |-- doc
        |   `-- vim
        `-- share
            |-- man
            |   `-- man1
            `-- vim
                `-- vim72
                    |-- autoload
                    |   `-- xml
                    |-- colors
                    |-- compiler
                    |-- doc
                    |-- ftplugin
                    |-- indent
                    |-- keymap
                    |-- lang
                    |-- macros
                    |   |-- hanoi
                    |   |-- life
                    |   |-- maze
                    |   `-- urm
                    |-- plugin
                    |-- print
                    |-- spell
                    |-- syntax
                    |-- tools
                    `-- tutor

I moved the three files (vim, vimtutor, xdd) in SMCvim/reloc/bin to $HOME/bin, so now I can finally run $HOME/bin/vim! But where do I put the "share" directory and its content?

EDIT 3: It might also be worth noting that there already exists an install of vim on the system, but it is broken. When I try to run it I get:

ld.so.1: vim: fatal: libgtk-1.2.so.0: open failed: No such file or directory

"which vim" outputs /opt/local/bin/vim

EDIT 4: Trying to compile this on Solaris 10.

uname -a
SunOS ws005-22 5.10 Generic_141414-10 sun4u sparc SUNW,SPARC-Enterprise

New ./configure output:

./configure --prefix=$home/bin ac_cv_sizeof_int=8 --enable-rubyinterp configure: loading cache auto/config.cache checking whether make sets $(MAKE)... yes checking for gcc... gcc checking for C compiler default output file name... a.out checking whether the C compiler works... yes checking whether we are cross compiling... no checking for suffix of executables... checking for suffix of object files... o checking whether we are using the GNU C compiler... yes checking whether gcc accepts -g... yes checking for gcc option to accept ISO C89... unsupported checking how to run the C preprocessor... gcc -E checking for grep that handles long lines and -e... /usr/sfw/bin/ggrep checking for egrep... /usr/sfw/bin/ggrep -E checking for library containing strerror... none required checking for gawk... gawk checking for strip... strip checking for ANSI C header files... yes checking for sys/wait.h that is POSIX.1 compatible... no configure: checking for buggy tools... checking for BeOS... no checking for QNX... no checking for Darwin (Mac OS X)... no checking --with-local-dir argument... Defaulting to /usr/local checking --with-vim-name argument... Defaulting to vim checking --with-ex-name argument... Defaulting to ex checking --with-view-name argument... Defaulting to view checking --with-global-runtime argument... no checking --with-modified-by argument... no checking if character set is EBCDIC... no checking --disable-selinux argument... no checking for is_selinux_enabled in -lselinux... no checking --with-features argument... Defaulting to normal checking --with-compiledby argument... no checking --disable-xsmp argument... no checking --disable-xsmp-interact argument... no checking --enable-mzschemeinterp argument... no checking --enable-perlinterp argument...
no checking --enable-pythoninterp argument... no checking --enable-tclinterp argument... no checking --enable-rubyinterp argument... yes checking for ruby... /opt/sfw/bin/ruby checking Ruby version... OK checking Ruby header files... /opt/sfw/lib/ruby/1.6/sparc-solaris2.10 checking --enable-cscope argument... no checking --enable-workshop argument... no checking --disable-netbeans argument... no checking for socket in -lsocket... yes checking for gethostbyname in -lnsl... yes checking whether compiling netbeans integration is possible... no checking --enable-sniff argument... no checking --enable-multibyte argument... no checking --enable-hangulinput argument... no checking --enable-xim argument... defaulting to auto checking --enable-fontset argument... no checking for xmkmf... /usr/openwin/bin/xmkmf checking for X... libraries /usr/openwin/lib, headers /usr/openwin/include checking whether -R must be followed by a space... no checking for gethostbyname... yes checking for connect... yes checking for remove... yes checking for shmat... yes checking for IceConnectionNumber in -lICE... yes checking if X11 header files can be found... yes checking for _XdmcpAuthDoIt in -lXdmcp... no checking for IceOpenConnection in -lICE... yes checking for XpmCreatePixmapFromData in -lXpm... yes checking if X11 header files implicitly declare return values... no checking --enable-gui argument... yes/auto - automatic GUI support checking whether or not to look for GTK... yes checking whether or not to look for GTK+ 2... yes checking whether or not to look for GNOME... no checking whether or not to look for Motif... yes checking whether or not to look for Athena... yes checking whether or not to look for neXtaw... yes checking whether or not to look for Carbon... yes checking --with-gtk-prefix argument... no checking --with-gtk-exec-prefix argument... no checking --disable-gtktest argument... gtk test enabled checking for gtk-config... /opt/local/bin/gtk-config checking for pkg-config... /usr/bin/pkg-config checking for GTK - version = 2.2.0... yes; found version 2.4.9 checking X11/SM/SMlib.h usability... yes checking X11/SM/SMlib.h presence... yes checking for X11/SM/SMlib.h... yes checking X11/xpm.h usability... yes checking X11/xpm.h presence... yes checking for X11/xpm.h... yes checking X11/Sunkeysym.h usability... yes checking X11/Sunkeysym.h presence... yes checking for X11/Sunkeysym.h... yes checking for XIMText in X11/Xlib.h... yes X GUI selected; xim has been enabled checking whether toupper is broken... no checking whether __DATE__ and __TIME__ work... yes checking elf.h usability... yes checking elf.h presence... yes checking for elf.h... yes checking for main in -lelf... yes checking for dirent.h that defines DIR... yes checking for library containing opendir... none required checking for sys/wait.h that defines union wait... no checking stdarg.h usability... yes checking stdarg.h presence... yes checking for stdarg.h... yes checking stdlib.h usability... yes checking stdlib.h presence... yes checking for stdlib.h... yes checking string.h usability... yes checking string.h presence... yes checking for string.h... yes checking sys/select.h usability... yes checking sys/select.h presence... yes checking for sys/select.h... yes checking sys/utsname.h usability... yes checking sys/utsname.h presence... yes checking for sys/utsname.h... yes checking termcap.h usability... yes checking termcap.h presence... yes checking for termcap.h... yes checking fcntl.h usability... yes checking fcntl.h presence... 
yes checking for fcntl.h... yes checking sgtty.h usability... yes checking sgtty.h presence... yes checking for sgtty.h... yes checking sys/ioctl.h usability... yes checking sys/ioctl.h presence... yes checking for sys/ioctl.h... yes checking sys/time.h usability... yes checking sys/time.h presence... yes checking for sys/time.h... yes checking sys/types.h usability... yes checking sys/types.h presence... yes checking for sys/types.h... yes checking termio.h usability... yes checking termio.h presence... yes checking for termio.h... yes checking iconv.h usability... yes checking iconv.h presence... yes checking for iconv.h... yes checking langinfo.h usability... yes checking langinfo.h presence... yes checking for langinfo.h... yes checking math.h usability... yes checking math.h presence... yes checking for math.h... yes checking unistd.h usability... yes checking unistd.h presence... yes checking for unistd.h... yes checking stropts.h usability... no checking stropts.h presence... yes configure: WARNING: stropts.h: present but cannot be compiled configure: WARNING: stropts.h: check for missing prerequisite headers? configure: WARNING: stropts.h: see the Autoconf documentation configure: WARNING: stropts.h: section "Present But Cannot Be Compiled" configure: WARNING: stropts.h: proceeding with the preprocessor's result configure: WARNING: stropts.h: in the future, the compiler will take precedence checking for stropts.h... yes checking errno.h usability... yes checking errno.h presence... yes checking for errno.h... yes checking sys/resource.h usability... yes checking sys/resource.h presence... yes checking for sys/resource.h... yes checking sys/systeminfo.h usability... yes checking sys/systeminfo.h presence... yes checking for sys/systeminfo.h... yes checking locale.h usability... yes checking locale.h presence... yes checking for locale.h... yes checking sys/stream.h usability... no checking sys/stream.h presence... yes configure: WARNING: sys/stream.h: present but cannot be compiled configure: WARNING: sys/stream.h: check for missing prerequisite headers? configure: WARNING: sys/stream.h: see the Autoconf documentation configure: WARNING: sys/stream.h: section "Present But Cannot Be Compiled" configure: WARNING: sys/stream.h: proceeding with the preprocessor's result configure: WARNING: sys/stream.h: in the future, the compiler will take precedence checking for sys/stream.h... yes checking termios.h usability... yes checking termios.h presence... yes checking for termios.h... yes checking libc.h usability... no checking libc.h presence... no checking for libc.h... no checking sys/statfs.h usability... yes checking sys/statfs.h presence... yes checking for sys/statfs.h... yes checking poll.h usability... yes checking poll.h presence... yes checking for poll.h... yes checking sys/poll.h usability... yes checking sys/poll.h presence... yes checking for sys/poll.h... yes checking pwd.h usability... yes checking pwd.h presence... yes checking for pwd.h... yes checking utime.h usability... yes checking utime.h presence... yes checking for utime.h... yes checking sys/param.h usability... yes checking sys/param.h presence... yes checking for sys/param.h... yes checking libintl.h usability... yes checking libintl.h presence... yes checking for libintl.h... yes checking libgen.h usability... yes checking libgen.h presence... yes checking for libgen.h... yes checking util/debug.h usability... no checking util/debug.h presence... no checking for util/debug.h... 
no checking util/msg18n.h usability... no checking util/msg18n.h presence... no checking for util/msg18n.h... no checking frame.h usability... no checking frame.h presence... no checking for frame.h... no checking sys/acl.h usability... yes checking sys/acl.h presence... yes checking for sys/acl.h... yes checking sys/access.h usability... no checking sys/access.h presence... no checking for sys/access.h... no checking sys/sysctl.h usability... no checking sys/sysctl.h presence... no checking for sys/sysctl.h... no checking sys/sysinfo.h usability... yes checking sys/sysinfo.h presence... yes checking for sys/sysinfo.h... yes checking wchar.h usability... yes checking wchar.h presence... yes checking for wchar.h... yes checking wctype.h usability... yes checking wctype.h presence... yes checking for wctype.h... yes checking for sys/ptem.h... no checking for pthread_np.h... no checking strings.h usability... yes checking strings.h presence... yes checking for strings.h... yes checking if strings.h can be included after string.h... yes checking whether gcc needs -traditional... no checking for an ANSI C-conforming const... yes checking for mode_t... yes checking for off_t... yes checking for pid_t... yes checking for size_t... yes checking for uid_t in sys/types.h... yes checking whether time.h and sys/time.h may both be included... yes checking for ino_t... yes checking for dev_t... yes checking for rlim_t... yes checking for stack_t... yes checking whether stack_t has an ss_base field... no checking --with-tlib argument... empty: automatic terminal library selection checking for tgetent in -lncurses... yes checking whether we talk terminfo... yes checking what tgetent() returns for an unknown terminal... zero checking whether termcap.h contains ospeed... yes checking whether termcap.h contains UP, BC and PC... yes checking whether tputs() uses outfuntype... no checking whether sys/select.h and sys/time.h may both be included... yes checking for /dev/ptc... no checking for SVR4 ptys... yes checking for ptyranges... don't know checking default tty permissions/group... can't determine - assume ptys are world accessable world checking return type of signal handlers... void checking for struct sigcontext... no checking getcwd implementation is broken... no checking for bcmp... yes checking for fchdir... yes checking for fchown... yes checking for fseeko... yes checking for fsync... yes checking for ftello... yes checking for getcwd... yes checking for getpseudotty... no checking for getpwnam... yes checking for getpwuid... yes checking for getrlimit... yes checking for gettimeofday... yes checking for getwd... yes checking for lstat... yes checking for memcmp... yes checking for memset... yes checking for nanosleep... no checking for opendir... yes checking for putenv... yes checking for qsort... yes checking for readlink... yes checking for select... yes checking for setenv... yes checking for setpgid... yes checking for setsid... yes checking for sigaltstack... yes checking for sigstack... yes checking for sigset... yes checking for sigsetjmp... yes checking for sigaction... yes checking for sigvec... no checking for strcasecmp... yes checking for strerror... yes checking for strftime... yes checking for stricmp... no checking for strncasecmp... yes checking for strnicmp... no checking for strpbrk... yes checking for strtol... yes checking for tgetent... yes checking for towlower... yes checking for towupper... yes checking for iswupper... yes checking for usleep... yes checking for utime... 
yes checking for utimes... yes checking for st_blksize... no checking whether stat() ignores a trailing slash... no checking for iconv_open()... yes; with -liconv checking for nl_langinfo(CODESET)... yes checking for strtod in -lm... yes checking for strtod() and other floating point functions... yes checking --disable-acl argument... no checking for acl_get_file in -lposix1e... no checking for acl_get_file in -lacl... no checking for POSIX ACL support... no checking for Solaris ACL support... yes checking for AIX ACL support... no checking --disable-gpm argument... no checking for gpm... no checking --disable-sysmouse argument... no checking for sysmouse... no checking for rename... yes checking for sysctl... not usable checking for sysinfo... not usable checking for sysinfo.mem_unit... no checking for sysconf... yes checking size of int... (cached) 8 checking whether memmove handles overlaps... yes checking for _xpg4_setrunelocale in -lxpg4... no checking how to create tags... ctags -t checking how to run man with a section nr... man -s checking --disable-nls argument... no checking for msgfmt... msgfmt checking for NLS... no "po/Makefile" - disabled checking dlfcn.h usability... yes checking dlfcn.h presence... yes checking for dlfcn.h... yes checking for dlopen()... yes checking for dlsym()... yes checking setjmp.h usability... yes checking setjmp.h presence... yes checking for setjmp.h... yes checking for GCC 3 or later... yes configure: updating cache auto/config.cache configure: creating auto/config.status config.status: creating auto/config.mk config.status: creating auto/config.h Make: make Starting make in the src directory. If there are problems, cd to the src directory and run make there cd src && make first mkdir objects CC="gcc -Iproto -DHAVE_CONFIG_H -DFEAT_GUI_GTK -I/usr/include/gtk-2.0 -I/usr/lib/gtk-2.0/include -I/usr/include/atk-1.0 -I/usr/include/pango-1.0 -I/usr/openwin/include -I/usr/sfw/include -I/usr/sfw/include/freetype2 -I/usr/include/glib-2.0 -I/usr/lib/glib-2.0/include -I/usr/openwin/include -I/opt/sfw/lib/ruby/1.6/sparc-solaris2.10 " srcdir=. sh ./osdef.sh gcc -c -I. 
-Iproto -DHAVE_CONFIG_H -DFEAT_GUI_GTK -I/usr/include/gtk-2.0 -I/usr/lib/gtk-2.0/include -I/usr/include/atk-1.0 -I/usr/include/pango-1.0 -I/usr/openwin/include -I/usr/sfw/include -I/usr/sfw/include/freetype2 -I/usr/include/glib-2.0 -I/usr/lib/glib-2.0/include -g -O2 -I/usr/openwin/include -I/opt/sfw/lib/ruby/1.6/sparc-solaris2.10 -o objects/buffer.o buffer.c In file included from os_unix.h:29, from vim.h:245, from buffer.c:28: /usr/include/sys/stat.h:251: error: syntax error before "blksize_t" /usr/include/sys/stat.h:255: error: syntax error before '}' token /usr/include/sys/stat.h:309: error: syntax error before "blksize_t" /usr/include/sys/stat.h:310: error: conflicting types for 'st_blocks' /usr/include/sys/stat.h:252: error: previous declaration of 'st_blocks' was here /usr/include/sys/stat.h:313: error: syntax error before '}' token In file included from /opt/local/bin/../lib/gcc/sparc-sun-solaris2.6/3.4.6/include/sys/signal.h:132, from /usr/include/signal.h:26, from os_unix.h:163, from vim.h:245, from buffer.c:28: /usr/include/sys/siginfo.h:259: error: syntax error before "ctid_t" /usr/include/sys/siginfo.h:292: error: syntax error before '}' token /usr/include/sys/siginfo.h:294: error: syntax error before '}' token /usr/include/sys/siginfo.h:390: error: syntax error before "ctid_t" /usr/include/sys/siginfo.h:398: error: conflicting types for '__fault' /usr/include/sys/siginfo.h:267: error: previous declaration of '__fault' was here /usr/include/sys/siginfo.h:404: error: conflicting types for '__file' /usr/include/sys/siginfo.h:273: error: previous declaration of '__file' was here /usr/include/sys/siginfo.h:420: error: conflicting types for '__prof' /usr/include/sys/siginfo.h:287: error: previous declaration of '__prof' was here /usr/include/sys/siginfo.h:424: error: conflicting types for '__rctl' /usr/include/sys/siginfo.h:291: error: previous declaration of '__rctl' was here /usr/include/sys/siginfo.h:426: error: syntax error before '}' token /usr/include/sys/siginfo.h:428: error: syntax error before '}' token /usr/include/sys/siginfo.h:432: error: syntax error before "k_siginfo_t" /usr/include/sys/siginfo.h:437: error: syntax error before '}' token In file included from /usr/include/signal.h:26, from os_unix.h:163, from vim.h:245, from buffer.c:28: /opt/local/bin/../lib/gcc/sparc-sun-solaris2.6/3.4.6/include/sys/signal.h:173: error: syntax error before "siginfo_t" In file included from os_unix.h:163, from vim.h:245, from buffer.c:28: /usr/include/signal.h:111: error: syntax error before "siginfo_t" /usr/include/signal.h:113: error: syntax error before "siginfo_t" buffer.c: In function `buflist_new': buffer.c:1502: error: storage size of 'st' isn't known buffer.c: In function `buflist_findname': buffer.c:1989: error: storage size of 'st' isn't known buffer.c: In function `setfname': buffer.c:2578: error: storage size of 'st' isn't known buffer.c: In function `otherfile_buf': buffer.c:2836: error: storage size of 'st' isn't known buffer.c: In function `buf_setino': buffer.c:2874: error: storage size of 'st' isn't known buffer.c: In function `buf_same_ino': buffer.c:2894: error: dereferencing pointer to incomplete type buffer.c:2895: error: dereferencing pointer to incomplete type *** Error code 1 make: Fatal error: Command failed for target `objects/buffer.o' Current working directory /home/xluntor/vim72/src *** Error code 1 make: Fatal error: Command failed for target `first'

    Read the article

  • I need advice: small memory footprint Linux mail server with spam filtering

    - by petermolnar
    I have a VPS which was originally intended to be a webserver, but some minimal mail capability needs to be deployed on it as well, including sending and receiving as a standalone server. The current setup is the following: Postfix receives the mail; the users are in virtual tables, stored in MySQL; on connection, every server is tested with the policyd-weight service against some DNSBLs; all mail is run through SpamAssassin's spamd with the help of the spamc client; the mail is then delivered with Dovecot 2's LDA (local delivery agent), with virtual users as well. As you can see, there's no virus scanner running, and that's for a reason: ClamAV eats all the memory it can get, and virus mails are all filtered out by this setup anyway (I tested the same setup with ClamAV enabled for 1.5 years and no virus mail ever even reached ClamAV). I don't use amavisd and I really don't want to; you only need that monster if you have plenty of memory and lots of simultaneous scanners, and it's a nightmare to fine-tune by hand. I run policyd-weight instead of policyd and Postfix's native DNSBL checks because I don't like to turn someone away just because a single service listed them. Important statement: everything works fine. I receive a very small amount of spam, almost never get a false positive, and most of the bad mail is stopped by policyd-weight. The only "problem" is that I feel the services altogether use a bit too much memory. I've already cut down the SpamAssassin modules (see below), but I'd really like to hear some advice on how to cut the memory footprint as low as possible, mostly: which plugins does SpamAssassin really need and which are more or less useless, given my current Postfix & policyd-weight setup? SpamAssassin rules are also compiled with sa-compile (sa-update runs once a week from cron; the compile runs right after that). These are some of the current configurations that may matter; please tell me if you need anything more. postfix/master.cf (parts only) dovecot unix - n n - - pipe flags=DRhu user=vmail:vmail argv=/usr/bin/spamc -e /usr/lib/dovecot/deliver -d ${recipient} -f {sender} postfix/main.cf (parts only) smtpd_helo_required = yes smtpd_helo_restrictions = permit_mynetworks, reject_invalid_hostname, permit smtpd_recipient_restrictions = permit_mynetworks, permit_sasl_authenticated, reject_invalid_hostname, reject_non_fqdn_hostname, reject_non_fqdn_recipient, reject_unknown_recipient_domain, reject_unauth_pipelining, reject_unauth_destination, check_policy_service inet:127.0.0.1:12525, permit policyd-weight.conf (parts only) $REJECTMSG = "550 Mail appeared to be SPAM or forged. Ask your Mail/DNS-Administrator to correct HELO and DNS MX settings or to get removed from DNSBLs"; $REJECTLEVEL = 4; $DEFER_STRING = 'IN_SPAMCOP= BOGUS_MX='; $DEFER_ACTION = '450'; $DEFER_LEVEL = 5; $DNSERRMSG = '450 No DNS entries for your MTA, HELO and Domain.
Contact YOUR administrator'; # 1: ON, 0: OFF (default) # If ON request that ALL clients are only checked against RBLs $dnsbl_checks_only = 0; # 1: ON (default), 0: OFF # When set to ON it logs only RBLs which affect scoring (positive or negative) $LOG_BAD_RBL_ONLY = 1; ## DNSBL settings @dnsbl_score = ( # host, hit, miss, log name 'dnsbl.ahbl.org', 3, -1, 'dnsbl.ahbl.org', 'dnsbl.njabl.org', 3, -1, 'dnsbl.njabl.org', 'dnsbl.sorbs.net', 3, -1, 'dnsbl.sorbs.net', 'bl.spamcop.net', 3, -1, 'bl.spamcop.net', 'zen.spamhaus.org', 3, -1, 'zen.spamhaus.org', 'pbl.spamhaus.org', 3, -1, 'pbl.spamhaus.org', 'cbl.abuseat.org', 3, -1, 'cbl.abuseat.org', 'list.dsbl.org', 3, -1, 'list.dsbl.org', ); # If Client IP is listed in MORE DNSBLS than this var, it gets REJECTed immediately $MAXDNSBLHITS = 3; # alternatively, if the score of DNSBLs is ABOVE this level, reject immediately $MAXDNSBLSCORE = 9; $MAXDNSBLMSG = '550 Az levelezoszerveruk IP cime tul sok spamlistan talahato, kerjuk ellenorizze! / Your MTA is listed in too many DNSBLs; please check.'; ## RHSBL settings @rhsbl_score = ( 'multi.surbl.org', 4, 0, 'multi.surbl.org', 'rhsbl.ahbl.org', 4, 0, 'rhsbl.ahbl.org', 'dsn.rfc-ignorant.org', 4, 0, 'dsn.rfc-ignorant.org', # 'postmaster.rfc-ignorant.org', 0.1, 0, 'postmaster.rfc-ignorant.org', # 'abuse.rfc-ignorant.org', 0.1, 0, 'abuse.rfc-ignorant.org' ); # skip a RBL if this RBL had this many continuous errors $BL_ERROR_SKIP = 2; # skip a RBL for that many times $BL_SKIP_RELEASE = 10; ## cache stuff # must be a directory (add trailing slash) $LOCKPATH = '/var/run/policyd-weight/'; # socket path for the cache daemon. $SPATH = $LOCKPATH.'/polw.sock'; # how many seconds the cache may be idle before starting maintenance routines #NOTE: standard maintenance jobs happen regardless of this setting. $MAXIDLECACHE = 60; # after this number of requests do following maintenance jobs: checking for config changes $MAINTENANCE_LEVEL = 5; # negative (i.e. SPAM) result cache settings ################################## # set to 0 to disable caching for spam results. To this level the cache will be cleaned. $CACHESIZE = 2000; # at this number of entries cleanup takes place $CACHEMAXSIZE = 4000; $CACHEREJECTMSG = '550 temporarily blocked because of previous errors'; # after NTTL retries the cache entry is deleted $NTTL = 1; # client MUST NOT retry within this seconds in order to decrease TTL counter $NTIME = 30; # positve (i.,e. HAM) result cache settings ################################### # set to 0 to disable caching of HAM. To this number of entries the cache will be cleaned $POSCACHESIZE = 1000; # at this number of entries cleanup takes place $POSCACHEMAXSIZE = 2000; $POSCACHEMSG = 'using cached result'; #after PTTL requests the HAM entry must succeed one time the RBL checks again $PTTL = 60; # after $PTIME in HAM Cache the client must pass one time the RBL checks again. #Values must be nonfractal. Accepted time-units: s, m, h, d $PTIME = '3h'; # The client must pass this time the RBL checks in order to be listed as hard-HAM # After this time the client will pass immediately for PTTL within PTIME $TEMP_PTIME = '1d'; ## DNS settings # Retries for ONE DNS-Lookup $DNS_RETRIES = 1; # Retry-interval for ONE DNS-Lookup $DNS_RETRY_IVAL = 5; # max error count for unresponded queries in a complete policy query $MAXDNSERR = 3; $MAXDNSERRMSG = 'passed - too many local DNS-errors'; # persistent udp connection for DNS queries. #broken in Net::DNS version 0.51. 
Works with Net::DNS 0.53; DEFAULT: off $PUDP= 0; # Force the usage of Net::DNS for RBL lookups. # Normally policyd-weight tries to use a faster RBL lookup routine instead of Net::DNS $USE_NET_DNS = 0; # A list of space separated NS IPs # This overrides resolv.conf settings # Example: $NS = '1.2.3.4 1.2.3.5'; # DEFAULT: empty $NS = ''; # timeout for receiving from cache instance $IPC_TIMEOUT = 2; # If set to 1 policyd-weight closes connections to smtpd clients in order to avoid too many #established connections to one policyd-weight child $TRY_BALANCE = 0; # scores for checks, WARNING: they may manipulate eachother # or be factors for other scores. # HIT score, MISS Score @client_ip_eq_helo_score = (1.5, -1.25 ); @helo_score = (1.5, -2 ); @helo_score = (0, -2 ); @helo_from_mx_eq_ip_score= (1.5, -3.1 ); @helo_numeric_score= (2.5, 0 ); @from_match_regex_verified_helo= (1,-2 ); @from_match_regex_unverified_helo = (1.6, -1.5 ); @from_match_regex_failed_helo = (2.5, 0 ); @helo_seems_dialup = (1.5, 0 ); @failed_helo_seems_dialup= (2, 0 ); @helo_ip_in_client_subnet= (0,-1.2 ); @helo_ip_in_cl16_subnet = (0,-0.41 ); #@client_seems_dialup_score = (3.75, 0 ); @client_seems_dialup_score = (0, 0 ); @from_multiparted = (1.09, 0 ); @from_anon= (1.17, 0 ); @bogus_mx_score = (2.1, 0 ); @random_sender_score = (0.25, 0 ); @rhsbl_penalty_score = (3.1, 0 ); @enforce_dyndns_score = (3, 0 ); spamassassin/init.pre (I've put the .pre files together) loadplugin Mail::SpamAssassin::Plugin::Hashcash loadplugin Mail::SpamAssassin::Plugin::SPF loadplugin Mail::SpamAssassin::Plugin::Pyzor loadplugin Mail::SpamAssassin::Plugin::Razor2 loadplugin Mail::SpamAssassin::Plugin::AutoLearnThreshold loadplugin Mail::SpamAssassin::Plugin::MIMEHeader loadplugin Mail::SpamAssassin::Plugin::ReplaceTags loadplugin Mail::SpamAssassin::Plugin::Check loadplugin Mail::SpamAssassin::Plugin::HTTPSMismatch loadplugin Mail::SpamAssassin::Plugin::URIDetail loadplugin Mail::SpamAssassin::Plugin::Bayes loadplugin Mail::SpamAssassin::Plugin::BodyEval loadplugin Mail::SpamAssassin::Plugin::DNSEval loadplugin Mail::SpamAssassin::Plugin::HTMLEval loadplugin Mail::SpamAssassin::Plugin::HeaderEval loadplugin Mail::SpamAssassin::Plugin::MIMEEval loadplugin Mail::SpamAssassin::Plugin::RelayEval loadplugin Mail::SpamAssassin::Plugin::URIEval loadplugin Mail::SpamAssassin::Plugin::WLBLEval loadplugin Mail::SpamAssassin::Plugin::VBounce loadplugin Mail::SpamAssassin::Plugin::Rule2XSBody spamassassin/local.cf (parts) use_bayes 1 bayes_auto_learn 1 bayes_store_module Mail::SpamAssassin::BayesStore::MySQL bayes_sql_dsn DBI:mysql:db:127.0.0.1:3306 bayes_sql_username user bayes_sql_password pass bayes_ignore_header X-Bogosity bayes_ignore_header X-Spam-Flag bayes_ignore_header X-Spam-Status ### User settings user_scores_dsn DBI:mysql:db:127.0.0.1:3306 user_scores_sql_password user user_scores_sql_username pass user_scores_sql_custom_query SELECT preference, value FROM _TABLE_ WHERE username = _USERNAME_ OR username = '$GLOBAL' OR username = CONCAT('%',_DOMAIN_) ORDER BY username ASC # for better speed score DNS_FROM_AHBL_RHSBL 0 score __RFC_IGNORANT_ENVFROM 0 score DNS_FROM_RFC_DSN 0 score DNS_FROM_RFC_BOGUSMX 0 score __DNS_FROM_RFC_POST 0 score __DNS_FROM_RFC_ABUSE 0 score __DNS_FROM_RFC_WHOIS 0 UPDATE 01 As adaptr advised I remove policyd-weight and configured postfix postscreen, this resulted approximately -15-20 MB from RAM usage and a lot faster work. I'm not sure it's working at full capacity but it seems promising.
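For anyone wanting to reproduce the policyd-weight-to-postscreen swap mentioned in UPDATE 01, here is a minimal sketch of the relevant Postfix settings. It is an illustration under assumptions, not the actual configuration used: the DNSBL sites and weights are borrowed from the lists already quoted above, postscreen requires Postfix 2.8 or later, and the master.cf lines mirror the stock examples shipped with Postfix (chroot columns may differ per distribution).
main.cf (sketch):
postscreen_access_list = permit_mynetworks
postscreen_greet_action = enforce
postscreen_dnsbl_sites = zen.spamhaus.org*3, bl.spamcop.net*2, dnsbl.sorbs.net*1
postscreen_dnsbl_threshold = 3
postscreen_dnsbl_action = enforce
master.cf (sketch) - replace the plain smtpd listener with postscreen and add its helper services:
smtp      inet  n       -       n       -       1       postscreen
smtpd     pass  -       -       n       -       -       smtpd
dnsblog   unix  -       -       n       -       0       dnsblog
tlsproxy  unix  -       -       n       -       0       tlsproxy
Because postscreen keeps a shared cache and drops most zombie clients before an smtpd process is ever forked, it typically needs far less memory than an external policy daemon, which is consistent with the 15-20 MB saving reported in the update.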

    Read the article

  • SQLAuthority News – TechEd India – April 12-14, 2010 Bangalore – An Unforgettable Experience – An Op

    - by pinaldave
    TechEd India was one of the largest Technology events in India led by Microsoft. This event was attended by more than 3,000 technology enthusiasts, making it one of the most well-organized events of the year. Though I attempted to attend almost all the technology events here, I have not seen any bigger or better event in Indian subcontinents other than this. There are 21 Technical Tracks at Tech·Ed India 2010 that span more than 745 learning opportunities. I was fortunate enough to be a part of this whole event as a speaker and a delegate, as well. TechEd India Speaker Badge and A Token of Lifetime Hotel Selection I presented three different sessions at TechEd India and was also a part of panel discussion. (The details of the sessions are given at the end of this blog post.) Due to extensive traveling, I stay away from my family occasionally. For this reason, I took my wife – Nupur and daughter Shaivi (8 months old) to the event along with me. We stayed at the same hotel where the event was organized so as to maximize my time bonding with my family and to have more time in networking with technology community, at the same time. The hotel Lalit Ashok is the largest and most luxurious venue one can find in Bangalore, located in the middle of the city. The cost of the hotel was a bit pricey, but looking at all the advantages, I had decided to ask for a booking there. Hotel Lalit Ashok Nupur Dave and Shaivi Dave Arrival Day – DAY 0 – April 11, 2010 I reached the event a day earlier, and that was one wise decision for I was able to relax a bit and go over my presentation for the next day’s course. I am a kind of person who likes to get everything ready ahead of time. I was also able to enjoy a pleasant evening with several Microsoft employees and my family friends. I even checked out the location where I would be doing presentations the next day. I was fortunate enough to meet Bijoy Singhal from Microsoft who helped me out with a few of the logistics issues that occured the day before. I was not aware of the fact that the very next day he was going to be “The Man” of the TechEd 2010 event. Vinod Kumar from Microsoft was really very kind as he talked to me regarding my subsequent session. He gave me some suggestions which were really helpful that I was able to incorporate them during my presentation. Finally, I was able to meet Abhishek Kant from Microsoft; his valuable suggestions and unlimited passion have inspired many people like me to work with the Community. Pradipta from Microsoft was also around, being extremely busy with logistics; however, in those busy times, he did find some good spare time to have a chat with me and the other Community leaders. I also met Harish Ranganathan and Sachin Rathi, both from Microsoft. It was so interesting to listen to both of them talking about SharePoint. I just have no words to express my overwhelmed spirit because of all these passionate young guys - Pradipta,Vinod, Bijoy, Harish, Sachin and Ahishek (of course!). Map of TechEd India 2010 Event Day 1 – April 12, 2010 From morning until night time, today was truly a very busy day for me. I had two presentations and one panel discussion for the day. Needless to say, I had a few meetings to attend as well. The day started with a keynote from S. Somaseger where he announced the launch of Visual Studio 2010. The keynote area was really eye-catching because of the very large, bigger-than- life uniform screen. This was truly one to show. 
The title music of the keynote was very interesting and it featured Bijoy Singhal as the model. It was interesting to talk to him afterwards, when we laughed at jokes together about his modeling assignment. TechEd India Keynote Opening Featuring Bijoy TechEd India 2010 Keynote – S. Somasegar Time: 11:15pm – 11:45pm Session 1: True Lies of SQL Server – SQL Myth Buster Following the excellent keynote, I had my very first session on the subject of SQL Server Myth Buster. At first, I was a bit nervous as right after the keynote, for this was my very first session and during my presentation I saw lots of Microsoft Product Team members. Well, it really went well and I had a really good discussion with attendees of the session. I felt that a well begin was half-done and my confidence was regained. Right after the session, I met a few of my Community friends and had meaningful discussions with them on many subjects. The abstract of the session is as follows: In this 30-minute demo session, I am going to briefly demonstrate few SQL Server Myths and their resolutions as I back them up with some demo. This demo presentation is a must-attend for all developers and administrators who would come to the event. This is going to be a very quick yet fun session. Pinal Presenting session at TechEd India 2010 Time: 1:00 PM – 2:00 PM Lunch with Somasegar After the session I went to see my daughter, and then I headed right away to the lunch with S. Somasegar – the keynote speaker and senior vice president of the Developer Division at Microsoft. I really thank to Abhishek who made it possible for us. Because of his efforts, all the MVPs had the opportunity to meet such a legendary person and had to talk with them on Microsoft Technology. Though Somasegar is currently holding such a high position in Microsoft, he is very polite and a real gentleman, and how I wish that everybody in industry is like him. Believe me, if you spread love and kindness, then that is what you will receive back. As soon as lunch time was over, I ran to the session hall as my second presentation was about to start. Time: 2:30pm – 3:30pm Session 2: Master Data Services in Microsoft SQL Server 2008 R2 Business Intelligence is a subject which was widely talked about at TechEd. Everybody was interested in this subject, and I did not excuse myself from this great concept as well. I consider myself fortunate as I was presenting on the subject of Master Data Services at TechEd. When I had initially learned this subject, I had a bit of confusion about the usage of this tool. Later on, I decided that I would tackle about how we all developers and DBAs are not able to understand something so simple such as this, and even worst, creating confusion about the technology. During system designing, it is very important to have a reference material or master lookup tables. Well, I talked about the same subject and presented the session keeping that as my center talk. The session went very well and I received lots of interesting questions. I got many compliments for talking about this subject on the real-life scenario. I really thank Rushabh Mehta (CEO, Solid Quality Mentors India) for his supportive suggestions that helped me prepare the slide deck, as well as the subject. Pinal Presenting session at TechEd India 2010 The abstract of the session is as follows: SQL Server Master Data Services will ship with SQL Server 2008 R2 and will improve Microsoft’s platform appeal. 
This session provides an in-depth demonstration of MDS features and highlights important usage scenarios. Master Data Services enables consistent decision-making process by allowing you to create, manage and propagate changes from a single master view of your business entities. Also, MDS – Master Data-hub which is a vital component, helps ensure the consistency of reporting across systems and deliver faster and more accurate results across the enterprise. We will talk about establishing the basis for a centralized approach to defining, deploying, and managing master data in the enterprise. Pinal Presenting session at TechEd India 2010 The day was still not over for me. I had ran into several friends but we were not able keep our enthusiasm under control about all the rumors saying that SQL Server 2008 R2 was about to be launched tomorrow in the keynote. I then ran to my third and final technical event for the day- a panel discussion with the top technologies of India. Time: 5:00pm – 6:00pm Panel Discussion: Harness the power of Web – SEO and Technical Blogging As I have delivered two technical sessions by this time, I was a bit tired but  not less enthusiastic when I had to talk about Blog and Technology. We discussed many different topics there. I told them that the most important aspect for any blog is its content. We discussed in depth the issues with plagiarism and how to avoid it. Another topic of discussion was how we technology bloggers can create awareness in the Community about what the right kind of blogging is and what morally and technically wrong acts are. A couple of questions were raised about what type of liberty a person can have in terms of writing blogs. Well, it was generically agreed that a blog is mainly a representation of our ideas and thoughts; it should not be governed by external entities. As long as one is writing what they really want to say, but not providing incorrect information or not practicing plagiarism, a blogger should be allowed to express himself. This panel discussion was supposed to be over in an hour, but the interest of the participants was remarkable and so it was extended for 30 minutes more. Finally, we decided to bring to a close the discussion and agreed that we will continue the topic next year. TechEd India Panel Discussion on Web, Technology and SEO Surprisingly, the day was just beginning after doing all of these. By this time, I have almost met all the MVP who arrived at the event, as well as many Microsoft employees. There were lots of Community folks present, too. I decided that I would go to meet several friends from the Community and continue to communicate with me on SQLAuthority.com. I also met Abhishek Baxi and had a good talk with him regarding Win Mobile and Twitter. He also took a very quick video of me wherein I spoke in my mother’s tongue, Gujarati. It was funny that I talked in Gujarati almost all the day, but when I was talking in the interview I could not find the right Gujarati words to speak. I think we all think in English when we think about Technology, so as to address universality. After meeting them, I headed towards the Speakers’ Dinner. Time: 8:00 PM – onwards Speakers Dinner The Speakers’ dinner was indeed a wonderful opportunity for all the speakers to get together and relax. We talked so many different things, from XBOX to Hindi Movies, and from SQL to Samosas. I just could not express how much fun I had. After a long evening, when I returned tmy room and met Shaivi, I just felt instantly relaxed. 
Kids are really gifts from God. Today was a really long but exciting day. So many things happened in just one day: Visual Studio Lanch, lunch with Somasegar, 2 technical sessions, 1 panel discussion, community leaders meeting, speakers dinner and, last but not leas,t playing with my child! A perfect day! Day 2 – April 13, 2010 Today started with a bang with the excellent keynote by Kamal Hathi who launched SQL Server 2008 R2 in India and demonstrated the power of PowerPivot to all of us. 101 Million Rows in Excel brought lots of applause from the audience. Kamal Hathi Presenting Keynote at TechEd India 2010 The day was a bit easier one for me. I had no sessions today and no events planned. I had a few meetings planned for the second day of the event. I sat in the speaker’s lounge for half a day and met many people there. I attended nearly 9 different meetings today. The subjects of the meetings were very different. Here is a list of the topics of the Community-related meetings: SQL PASS and its involvement in India and subcontinents How to start community blogging Forums and developing aptitude towards technology Ahmedabad/Gandhinagar User Groups and their developments SharePoint and SQL Business Meeting – a client meeting Business Meeting – a potential performance tuning project Business Meeting – Solid Quality Mentors (SolidQ) And family friends Pinal Dave at TechEd India The day passed by so quickly during this meeting. In the evening, I headed to Partners Expo with friends and checked out few of the booths. I really wanted to talk about some of the products, but due to the freebies there was so much crowd that I finally decided to just take the contact details of the partner. I will now start sending them with my queries and, hopefully, I will have my questions answered. Nupur and Shaivi had also one meeting to attend; it was with our family friend Vijay Raj. Vijay is also a person who loves Technology and loves it more than anybody. I see him growing and learning every day, but still remaining as a ‘human’. I believe that if someone acquires as much knowledge as him, that person will become either a computer or cyborg. Here, Vijay is still a kind gentleman and is able to stay as our close family friend. Shaivi was really happy to play with Uncle Vijay. Pinal Dave and Vijay Raj Renuka Prasad, a Microsoft MVP, impressed me with his passion and knowledge of SQL. Every time he gives me credit for his success, I believe that he is very humble. He has way more certifications than me and has worked many more years with SQL compared to me. He is an excellent photographer as well. Most of the photos in this blog post have been taken by him. I told him if ever he wants to do a part time job, he can do the photography very well. Pinal Dave and Renuka Prasad I also met L Srividya from Microsoft, whom I was looking forward to meet. She is a bundle of knowledge that everyone would surely learn a lot from her. I was able to get a few minutes from her and well, I felt confident. She enlightened me with SQL Server BI concepts, domain management and SQL Server security and few other interesting details. I also had a wonderful time talking about SharePoint with fellow Solid Quality Mentor Joy Rathnayake. He is very passionate about SharePoint but when you talk .NET and SQL with him, he is still overwhelmingly knowledgeable. In fact, while talking to him, I figured out that the recent training he delivered was on SQL Server 2008 R2. 
I told him as a joke that it hurts my ego that he is now more popular than me in SQL training and consulting. I am sure all of you agree that working with good people is a gift from God. I am fortunate enough to work with the best of the best industry experts. It was a great pleasure to hang out with my Community friends – Ahswin Kini, HimaBindu Vejella, Vasudev G, Suprotim Agrawal, Dhananjay, Vikram Pendse, Mahesh Dhola, Mahesh Mitkari, Manu Zacharia, Shobhan, Hardik Shah, Ashish Mohta, Manan, Subodh Sohani and Sanjay Shetty (of course!). (Please let me know if I met you at the event and forgot to list your name here.) Time: 8:00 PM – onwards Community Leaders Dinner After lots of meetings, I headed towards the Community Leaders dinner meeting and met almost all the folks I had met in the morning. The discussion was almost the same, but the really good thing was that we were enjoying it. The food was really good. Nupur was invited to the event, but Shaivi could not come. When Nupur tried to enter the event, she was stopped because Shaivi did not have a pass for the dinner. Nupur explained that Shaivi is only 8 months old, does not eat outside food and cannot stay by herself at this age, but the doorkeeper did not agree and insisted that without the entry details Shaivi could not go in, though Nupur could. Nupur called me on the phone and asked me to help her out. By the time I was outside, the organizer of the event had reached the door and happily approved Shaivi to join the party. Once in the party, Shaivi had lots of fun meeting so many people. Shaivi Dave and Abhishek Kant Dean Guida (Infragistics President and CEO) and Pinal Dave (SQLAuthority.com) Day 3 – April 14, 2010 Though it was the last day, I was very excited as I was about to present my favorite session. Query Optimization and Performance Tuning is my domain expertise, and I make my living by consulting and training on the same. Today's session was on that subject, with an additional twist: Spatial Database. I have always been intrigued by Spatial Database and have enjoyed learning about it; however, I had never thought much about Spatial Indexing before it was decided that I would do this session. I really thank Solid Quality Mentor Dr. Greg Low for his assistance in helping me prepare the slide deck and also reviewing the content. Furthermore, today was really what I call my 'learning day'. So far I had not attended any session at TechEd and I felt a bit down about that. Everybody spends their valuable time and money to learn something new and exciting at TechEd, and I had not attended a single session yet, even though it was already the last day of the event. I did have a plan for the day, and I attended two technical sessions before my own session on spatial databases. I attended 2 sessions of Vinod Kumar. Vinod is a natural storyteller and there was no doubt that his sessions would be jam-packed. People attended his sessions simply because Vinod is the speaker. He has never once disappointed an audience; he is truly a good speaker. He knows his stuff very well. I personally do not think that in India he can be compared to anyone for SQL. Time: 12:30pm-1:30pm SQL Server Query Optimization, Execution and Debugging Query Performance I really had a fun time attending this session. Vinod made this session very interactive. The entire audience really got into the presentation and started participating in the event.
Vinod presented a small Query Tuning problem, the kind any developer would have encountered, and solved it with the audience's help in such a fashion that each developer felt he or she had resolved it themselves. For one question, I was the only one ready to answer, and Vinod told me in a light tone that I was not allowed to answer it! The audience found it very amusing. There was a huge crowd around Vinod after the session. Vinod – A master storyteller! Time: 3:45pm-4:45pm Data Recovery / consistency with CheckDB This session was much heavier than the earlier one, and I must say it is my favorite session I have EVER attended in India. At this TechEd I attended only two sessions, but in my career I have attended numerous technical sessions, not only in India but all over the world. This session took my breath away. One by one, Vinod took different databases and started to corrupt them in different ways; each database has its own unique ways of getting corrupted. Once that was done, Vinod brought out DBCC CHECKDB and demonstrated how it can solve the problem. He finally fixed all the databases with this single tool. I have a good knowledge of this subject, but let me honestly admit that I learned a lot from this session. I enjoyed and cheered during this session along with the other attendees. I had the satisfaction that, just like everyone, I took advantage of the event and learned something. I am now TECHnically EDucated. Pinal Dave and Vinod Kumar After two very interactive and informative SQL sessions from Vinod Kumar, the next turn was mine: presenting on Spatial Database and Indexing. I got nervous once again, but Vinod told me to stay natural and do my presentation. Well, once I got a huge stage with a total of four projectors and a large crowd, I felt better. Time: 5:00pm-6:00pm Session 3: Developing with SQL Server Spatial and Deep Dive into Spatial Indexing Pinal Presenting session at TechEd India 2010 Pinal Presenting session at TechEd India 2010 I kicked off this session with Michael J Swart's beautiful spatial image. This session was the last one of the day but, to my surprise, I had more than 200 attendees. Slowly, the rain started outside and I was worried that the hall would not be full; despite this, there was not a single seat available within the first five minutes of the session. Thanks to all of you for attending my presentation. I demonstrated the map of the world (and India) and quickly explained what the geography and geometry data types in a spatial database are. The session also told the story of indexing and comparison: how different traditional indexes are from spatial indexes. Pinal Presenting session at TechEd India 2010 Due to the heavy rain during the event, the power went off for about 22 minutes (just an accident – nobody's fault). During those minutes there was no audio, no video and no light. I continued to address the mass of 200+ people without any audio device or PowerPoint. I must thank the audience, because not a single person left the session. They all stayed in their places; some moved closer to listen to me properly. I noticed that the curiosity and eagerness to learn new things was at its peak even though it was the very last session of TechEd. Everybody wanted to get the maximum knowledge out of the whole event. I was touched by the support from the audience. They listened and participated in my session even without any kind of technology (no ppt, no mike, no AC, nothing).
During those 22 minutes I completed my theory verbally. Pinal Presenting session at TechEd India 2010 After a while we got the projector back online and continued with some exciting demos. Many thanks to the Microsoft people who worked energetically in the background to get backup power for the projector up. I had a very interesting demo wherein I overlaid Bangalore and Hyderabad on the India map and found the aerial distance between them. After finding the aerial distance, we checked online and found that SQL Server's estimate closely matches the actual distance between these two cities. There was huge applause from the crowd at the fact that SQL Server takes the curvature of the earth into account and computes precise distances. While finding the distance, I demonstrated a few examples of spatial indexes, showing how one can use those indexes for such distance queries and how they can improve the performance of similar queries. I also demonstrated a few examples showing for which data type a spatial index is most useful. We finished the demos with a few more internals. Pinal Presenting session at TechEd India 2010 Despite all the issues, I was mostly satisfied with my presentation. I think it was the best session I have ever presented at any conference. There was no help from technology for a while, but I still got lots of appreciation at the end. When we ended the session, the applause from the audience was so loud that for a moment the rain was not audible. I was truly moved by the dedication of the technology enthusiasts. Pinal Dave After Presenting session at TechEd India 2010 The abstract of the session is as follows: Microsoft SQL Server 2008 delivers new spatial data types that enable you to consume, use, and extend location-based data through spatial-enabled applications. Attend this session to learn how to use the spatial functionality in the next version of SQL Server to build and optimize spatial queries. This session outlines how to use the new geography data type to store geodetic spatial data and perform operations on it, use the new geometry data type to store planar spatial data and perform operations on it, take advantage of new spatial indexes for high performance queries, use the new spatial results tab to quickly and easily view spatial query results directly from within Management Studio, extend spatial data capabilities by building or integrating location-enabled applications through support for spatial standards and specifications, and much more. Time: 8:00 PM – onwards Dinner by Sponsors After the lively sessions of the day, there was another dinner party, courtesy of one of the sponsors of TechEd. All the MVPs and several Community leaders were present at the dinner. I would like to express my gratitude to Abhishek Kant for organizing this wonderful event for us. It was a blast and really relaxing in every way. We all stayed there for a long time and talked about our sweet and unforgettable memories of the event. Pinal Dave and Bijoy Singhal It was really one wonderful event. After writing this much, I still have no words to express how much I enjoyed TechEd. However, it is true that I have shared with you only 1% of the total activities I did at the event. There were so many people I met who are not mentioned here, although I wanted to write their names here, too.
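For readers who want to try the aerial-distance demo described above, here is a minimal T-SQL sketch of the kind of query involved (the coordinates are approximate published values for the two cities and the names are illustrative; this is not the exact script shown in the session). With the geography type and SRID 4326, STDistance() returns meters on the round-earth model, so the result works out to roughly 500 km:
DECLARE @Bangalore geography = geography::Point(12.9716, 77.5946, 4326);
DECLARE @Hyderabad geography = geography::Point(17.3850, 78.4867, 4326);
-- STDistance() computes the geodesic distance, so the curvature of the earth is accounted for
SELECT @Bangalore.STDistance(@Hyderabad) / 1000.0 AS aerial_distance_km;
On a large table of points, a spatial index on the geography column is what makes this kind of within-distance query fast, which is the indexing story the session walked through.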
Anyway, I have learned so many things and up until now, I am not able to get over all the fun I had in this event. Pinal Dave at TechEd India 2010 The Next Days – April 15, 2010 – till today I am still not able to get my mind out of the whole experience I had at TechEd India 2010. It was like a whole Microsoft Family working together to celebrate a happy occasion. TechEd India – Truly An Unforgettable Experience! Reference : Pinal Dave (http://blog.SQLAuthority.com) Filed under: About Me, MVP, Pinal Dave, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, SQLAuthority Author Visit, SQLAuthority News, SQLServer, T SQL, Technology Tagged: TechEd, TechEdIn

    Read the article

  • Azure Mobile Services: what files does it consist of?

    - by svdoever
    Azure Mobile Services is a platform that provides a small set of functionality consisting of authentication, custom data tables, custom API’s, scheduling scripts and push notifications to be used as the back-end of a mobile application or if you want, any application or web site. As described in my previous post Azure Mobile Services: lessons learned the documentation on what can be used in the custom scripts is a bit minimalistic. The list below of all files the complete Azure Mobile Services platform consists of ca shed some light on what is available in the platform. In following posts I will provide more detailed information on what we can conclude from this list of files. Below are the available files as available in the Azure Mobile Services platform. The bold files are files that describe your data model, api scripts, scheduler scripts and table scripts. Those are the files you configure/construct to provide the “configuration”/implementation of you mobile service. The files are located in a folder like C:\DWASFiles\Sites\youreservice\VirtualDirectory0\site\wwwroot. One file is missing in the list below and that is the event log file C:\DWASFiles\Sites\youreservice\VirtualDirectory0\site\LogFiles\eventlog.xml where your messages written with for example console.log() and exception catched by the system are written. NOTA BENE: the Azure Mobile Services system is a system that is under full development, new releases may change the list of files. ./app.js ./App_Data/config/datamodel.json ./App_Data/config/scripts/api/youreapi.js ./App_Data/config/scripts/api/youreapi.json ./App_Data/config/scripts/scheduler/placeholder ./App_Data/config/scripts/scheduler/youresheduler.js ./App_Data/config/scripts/shared/placeholder ./App_Data/config/scripts/table/placeholder ./App_Data/config/scripts/table/yourtable.insert.js ./App_Data/config/scripts/table/yourtable.update.js ./App_Data/config/scripts/table/yourtable.delete.js ./App_Data/config/scripts/table/yourtable.read.js ./node_modules/apn/index.js ./node_modules/apn/lib/connection.js ./node_modules/apn/lib/device.js ./node_modules/apn/lib/errors.js ./node_modules/apn/lib/feedback.js ./node_modules/apn/lib/notification.js ./node_modules/apn/lib/util.js ./node_modules/apn/node_modules/q/package.json ./node_modules/apn/node_modules/q/q.js ./node_modules/apn/package.json ./node_modules/azure/lib/azure.js ./node_modules/azure/lib/cli/blobUtils.js ./node_modules/azure/lib/cli/cacheUtils.js ./node_modules/azure/lib/cli/callbackAggregator.js ./node_modules/azure/lib/cli/cert.js ./node_modules/azure/lib/cli/channel.js ./node_modules/azure/lib/cli/cli.js ./node_modules/azure/lib/cli/commands/account.js ./node_modules/azure/lib/cli/commands/config.js ./node_modules/azure/lib/cli/commands/deployment.js ./node_modules/azure/lib/cli/commands/deployment_.js ./node_modules/azure/lib/cli/commands/help.js ./node_modules/azure/lib/cli/commands/log.js ./node_modules/azure/lib/cli/commands/log_.js ./node_modules/azure/lib/cli/commands/repository.js ./node_modules/azure/lib/cli/commands/repository_.js ./node_modules/azure/lib/cli/commands/service.js ./node_modules/azure/lib/cli/commands/site.js ./node_modules/azure/lib/cli/commands/site_.js ./node_modules/azure/lib/cli/commands/vm.js ./node_modules/azure/lib/cli/common.js ./node_modules/azure/lib/cli/constants.js ./node_modules/azure/lib/cli/generate-psm1-utils.js ./node_modules/azure/lib/cli/generate-psm1.js ./node_modules/azure/lib/cli/iaas/blobserviceex.js ./node_modules/azure/lib/cli/iaas/deleteImage.js 
./node_modules/azure/lib/cli/iaas/image.js ./node_modules/azure/lib/cli/iaas/upload/blobInfo.js ./node_modules/azure/lib/cli/iaas/upload/bufferStream.js ./node_modules/azure/lib/cli/iaas/upload/intSet.js ./node_modules/azure/lib/cli/iaas/upload/jobTracker.js ./node_modules/azure/lib/cli/iaas/upload/pageBlob.js ./node_modules/azure/lib/cli/iaas/upload/streamMerger.js ./node_modules/azure/lib/cli/iaas/upload/uploadVMImage.js ./node_modules/azure/lib/cli/iaas/upload/vhdTools.js ./node_modules/azure/lib/cli/keyFiles.js ./node_modules/azure/lib/cli/patch-winston.js ./node_modules/azure/lib/cli/templates/node/iisnode.yml ./node_modules/azure/lib/cli/utils.js ./node_modules/azure/lib/diagnostics/logger.js ./node_modules/azure/lib/http/webresource.js ./node_modules/azure/lib/serviceruntime/fileinputchannel.js ./node_modules/azure/lib/serviceruntime/goalstatedeserializer.js ./node_modules/azure/lib/serviceruntime/namedpipeinputchannel.js ./node_modules/azure/lib/serviceruntime/namedpipeoutputchannel.js ./node_modules/azure/lib/serviceruntime/protocol1runtimeclient.js ./node_modules/azure/lib/serviceruntime/protocol1runtimecurrentstateclient.js ./node_modules/azure/lib/serviceruntime/protocol1runtimegoalstateclient.js ./node_modules/azure/lib/serviceruntime/roleenvironment.js ./node_modules/azure/lib/serviceruntime/runtimekernel.js ./node_modules/azure/lib/serviceruntime/runtimeversionmanager.js ./node_modules/azure/lib/serviceruntime/runtimeversionprotocolclient.js ./node_modules/azure/lib/serviceruntime/xmlcurrentstateserializer.js ./node_modules/azure/lib/serviceruntime/xmlgoalstatedeserializer.js ./node_modules/azure/lib/serviceruntime/xmlroleenvironmentdatadeserializer.js ./node_modules/azure/lib/services/blob/blobservice.js ./node_modules/azure/lib/services/blob/hmacsha256sign.js ./node_modules/azure/lib/services/blob/models/blobresult.js ./node_modules/azure/lib/services/blob/models/blocklistresult.js ./node_modules/azure/lib/services/blob/models/containeraclresult.js ./node_modules/azure/lib/services/blob/models/containerresult.js ./node_modules/azure/lib/services/blob/models/leaseresult.js ./node_modules/azure/lib/services/blob/models/listblobsresultcontinuation.js ./node_modules/azure/lib/services/blob/models/listcontainersresultcontinuation.js ./node_modules/azure/lib/services/blob/models/servicepropertiesresult.js ./node_modules/azure/lib/services/blob/sharedaccesssignature.js ./node_modules/azure/lib/services/blob/sharedkey.js ./node_modules/azure/lib/services/blob/sharedkeylite.js ./node_modules/azure/lib/services/core/connectionstringparser.js ./node_modules/azure/lib/services/core/exponentialretrypolicyfilter.js ./node_modules/azure/lib/services/core/linearretrypolicyfilter.js ./node_modules/azure/lib/services/core/servicebusserviceclient.js ./node_modules/azure/lib/services/core/servicebussettings.js ./node_modules/azure/lib/services/core/serviceclient.js ./node_modules/azure/lib/services/core/servicemanagementclient.js ./node_modules/azure/lib/services/core/servicemanagementsettings.js ./node_modules/azure/lib/services/core/servicesettings.js ./node_modules/azure/lib/services/core/storageserviceclient.js ./node_modules/azure/lib/services/core/storageservicesettings.js ./node_modules/azure/lib/services/queue/models/listqueuesresultcontinuation.js ./node_modules/azure/lib/services/queue/models/queuemessageresult.js ./node_modules/azure/lib/services/queue/models/queueresult.js ./node_modules/azure/lib/services/queue/models/servicepropertiesresult.js 
./node_modules/azure/lib/services/queue/queueservice.js ./node_modules/azure/lib/services/serviceBus/models/acstokenresult.js ./node_modules/azure/lib/services/serviceBus/models/queuemessageresult.js ./node_modules/azure/lib/services/serviceBus/models/queueresult.js ./node_modules/azure/lib/services/serviceBus/models/ruleresult.js ./node_modules/azure/lib/services/serviceBus/models/subscriptionresult.js ./node_modules/azure/lib/services/serviceBus/models/topicresult.js ./node_modules/azure/lib/services/serviceBus/servicebusservice.js ./node_modules/azure/lib/services/serviceBus/wrap.js ./node_modules/azure/lib/services/serviceBus/wrapservice.js ./node_modules/azure/lib/services/serviceBus/wraptokenmanager.js ./node_modules/azure/lib/services/serviceManagement/models/roleparser.js ./node_modules/azure/lib/services/serviceManagement/models/roleschema.json ./node_modules/azure/lib/services/serviceManagement/models/servicemanagementserialize.js ./node_modules/azure/lib/services/serviceManagement/servicemanagementservice.js ./node_modules/azure/lib/services/table/batchserviceclient.js ./node_modules/azure/lib/services/table/models/entityresult.js ./node_modules/azure/lib/services/table/models/queryentitiesresultcontinuation.js ./node_modules/azure/lib/services/table/models/querytablesresultcontinuation.js ./node_modules/azure/lib/services/table/models/servicepropertiesresult.js ./node_modules/azure/lib/services/table/models/tableresult.js ./node_modules/azure/lib/services/table/sharedkeylitetable.js ./node_modules/azure/lib/services/table/sharedkeytable.js ./node_modules/azure/lib/services/table/tablequery.js ./node_modules/azure/lib/services/table/tableservice.js ./node_modules/azure/lib/util/atomhandler.js ./node_modules/azure/lib/util/certificates/der.js ./node_modules/azure/lib/util/certificates/pkcs.js ./node_modules/azure/lib/util/constants.js ./node_modules/azure/lib/util/iso8061date.js ./node_modules/azure/lib/util/js2xml.js ./node_modules/azure/lib/util/rfc1123date.js ./node_modules/azure/lib/util/util.js ./node_modules/azure/lib/util/validate.js ./node_modules/azure/LICENSE.txt ./node_modules/azure/node_modules/async/index.js ./node_modules/azure/node_modules/async/lib/async.js ./node_modules/azure/node_modules/async/LICENSE ./node_modules/azure/node_modules/async/package.json ./node_modules/azure/node_modules/azure/lib/azure.js ./node_modules/azure/node_modules/azure/lib/diagnostics/logger.js ./node_modules/azure/node_modules/azure/lib/http/webresource.js ./node_modules/azure/node_modules/azure/lib/serviceruntime/fileinputchannel.js ./node_modules/azure/node_modules/azure/lib/serviceruntime/goalstatedeserializer.js ./node_modules/azure/node_modules/azure/lib/serviceruntime/namedpipeinputchannel.js ./node_modules/azure/node_modules/azure/lib/serviceruntime/namedpipeoutputchannel.js ./node_modules/azure/node_modules/azure/lib/serviceruntime/protocol1runtimeclient.js ./node_modules/azure/node_modules/azure/lib/serviceruntime/protocol1runtimecurrentstateclient.js ./node_modules/azure/node_modules/azure/lib/serviceruntime/protocol1runtimegoalstateclient.js ./node_modules/azure/node_modules/azure/lib/serviceruntime/roleenvironment.js ./node_modules/azure/node_modules/azure/lib/serviceruntime/runtimekernel.js ./node_modules/azure/node_modules/azure/lib/serviceruntime/runtimeversionmanager.js ./node_modules/azure/node_modules/azure/lib/serviceruntime/runtimeversionprotocolclient.js ./node_modules/azure/node_modules/azure/lib/serviceruntime/xmlcurrentstateserializer.js 
./node_modules/azure/node_modules/azure/lib/serviceruntime/xmlgoalstatedeserializer.js ./node_modules/azure/node_modules/azure/lib/serviceruntime/xmlroleenvironmentdatadeserializer.js ./node_modules/azure/node_modules/azure/lib/services/blob/blobservice.js ./node_modules/azure/node_modules/azure/lib/services/blob/internal/sharedaccesssignature.js ./node_modules/azure/node_modules/azure/lib/services/blob/internal/sharedkey.js ./node_modules/azure/node_modules/azure/lib/services/blob/internal/sharedkeylite.js ./node_modules/azure/node_modules/azure/lib/services/blob/models/blobresult.js ./node_modules/azure/node_modules/azure/lib/services/blob/models/blocklistresult.js ./node_modules/azure/node_modules/azure/lib/services/blob/models/containeraclresult.js ./node_modules/azure/node_modules/azure/lib/services/blob/models/containerresult.js ./node_modules/azure/node_modules/azure/lib/services/blob/models/leaseresult.js ./node_modules/azure/node_modules/azure/lib/services/blob/models/listblobsresultcontinuation.js ./node_modules/azure/node_modules/azure/lib/services/blob/models/listcontainersresultcontinuation.js ./node_modules/azure/node_modules/azure/lib/services/blob/models/servicepropertiesresult.js ./node_modules/azure/node_modules/azure/lib/services/core/connectionstringparser.js ./node_modules/azure/node_modules/azure/lib/services/core/exponentialretrypolicyfilter.js ./node_modules/azure/node_modules/azure/lib/services/core/hmacsha256sign.js ./node_modules/azure/node_modules/azure/lib/services/core/linearretrypolicyfilter.js ./node_modules/azure/node_modules/azure/lib/services/core/servicebusserviceclient.js ./node_modules/azure/node_modules/azure/lib/services/core/servicebussettings.js ./node_modules/azure/node_modules/azure/lib/services/core/serviceclient.js ./node_modules/azure/node_modules/azure/lib/services/core/servicemanagementclient.js ./node_modules/azure/node_modules/azure/lib/services/core/servicemanagementsettings.js ./node_modules/azure/node_modules/azure/lib/services/core/servicesettings.js ./node_modules/azure/node_modules/azure/lib/services/core/sqlserviceclient.js ./node_modules/azure/node_modules/azure/lib/services/core/storageserviceclient.js ./node_modules/azure/node_modules/azure/lib/services/core/storageservicesettings.js ./node_modules/azure/node_modules/azure/lib/services/queue/models/listqueuesresultcontinuation.js ./node_modules/azure/node_modules/azure/lib/services/queue/models/queuemessageresult.js ./node_modules/azure/node_modules/azure/lib/services/queue/models/queueresult.js ./node_modules/azure/node_modules/azure/lib/services/queue/models/servicepropertiesresult.js ./node_modules/azure/node_modules/azure/lib/services/queue/queueservice.js ./node_modules/azure/node_modules/azure/lib/services/serviceBus/apnsservice.js ./node_modules/azure/node_modules/azure/lib/services/serviceBus/gcmservice.js ./node_modules/azure/node_modules/azure/lib/services/serviceBus/internal/sharedaccesssignature.js ./node_modules/azure/node_modules/azure/lib/services/serviceBus/internal/wrap.js ./node_modules/azure/node_modules/azure/lib/services/serviceBus/internal/wraptokenmanager.js ./node_modules/azure/node_modules/azure/lib/services/serviceBus/models/acstokenresult.js ./node_modules/azure/node_modules/azure/lib/services/serviceBus/models/notificationhubresult.js ./node_modules/azure/node_modules/azure/lib/services/serviceBus/models/queuemessageresult.js ./node_modules/azure/node_modules/azure/lib/services/serviceBus/models/queueresult.js 
./node_modules/azure/node_modules/azure/lib/services/serviceBus/models/registrationresult.js ./node_modules/azure/node_modules/azure/lib/services/serviceBus/models/resourceresult.js ./node_modules/azure/node_modules/azure/lib/services/serviceBus/models/ruleresult.js ./node_modules/azure/node_modules/azure/lib/services/serviceBus/models/subscriptionresult.js ./node_modules/azure/node_modules/azure/lib/services/serviceBus/models/topicresult.js ./node_modules/azure/node_modules/azure/lib/services/serviceBus/notificationhubservice.js ./node_modules/azure/node_modules/azure/lib/services/serviceBus/servicebusservice.js ./node_modules/azure/node_modules/azure/lib/services/serviceBus/servicebusservicebase.js ./node_modules/azure/node_modules/azure/lib/services/serviceBus/wnsservice.js ./node_modules/azure/node_modules/azure/lib/services/serviceBus/wrapservice.js ./node_modules/azure/node_modules/azure/lib/services/serviceManagement/hdinsightservice.js ./node_modules/azure/node_modules/azure/lib/services/serviceManagement/models/roleparser.js ./node_modules/azure/node_modules/azure/lib/services/serviceManagement/models/roleschema.json ./node_modules/azure/node_modules/azure/lib/services/serviceManagement/models/servicemanagementserialize.js ./node_modules/azure/node_modules/azure/lib/services/serviceManagement/servicebusmanagementservice.js ./node_modules/azure/node_modules/azure/lib/services/serviceManagement/servicemanagementservice.js ./node_modules/azure/node_modules/azure/lib/services/serviceManagement/sqlmanagementservice.js ./node_modules/azure/node_modules/azure/lib/services/sqlAzure/models/databaseresult.js ./node_modules/azure/node_modules/azure/lib/services/sqlAzure/sqlserveracs.js ./node_modules/azure/node_modules/azure/lib/services/sqlAzure/sqlservice.js ./node_modules/azure/node_modules/azure/lib/services/table/batchserviceclient.js ./node_modules/azure/node_modules/azure/lib/services/table/internal/sharedkeylitetable.js ./node_modules/azure/node_modules/azure/lib/services/table/internal/sharedkeytable.js ./node_modules/azure/node_modules/azure/lib/services/table/models/entityresult.js ./node_modules/azure/node_modules/azure/lib/services/table/models/listresult.js ./node_modules/azure/node_modules/azure/lib/services/table/models/queryentitiesresultcontinuation.js ./node_modules/azure/node_modules/azure/lib/services/table/models/querytablesresultcontinuation.js ./node_modules/azure/node_modules/azure/lib/services/table/models/servicepropertiesresult.js ./node_modules/azure/node_modules/azure/lib/services/table/models/tableresult.js ./node_modules/azure/node_modules/azure/lib/services/table/tablequery.js ./node_modules/azure/node_modules/azure/lib/services/table/tableservice.js ./node_modules/azure/node_modules/azure/lib/util/atomhandler.js ./node_modules/azure/node_modules/azure/lib/util/constants.js ./node_modules/azure/node_modules/azure/lib/util/date.js ./node_modules/azure/node_modules/azure/lib/util/edmtype.js ./node_modules/azure/node_modules/azure/lib/util/iso8061date.js ./node_modules/azure/node_modules/azure/lib/util/js2xml.js ./node_modules/azure/node_modules/azure/lib/util/odatahandler.js ./node_modules/azure/node_modules/azure/lib/util/rfc1123date.js ./node_modules/azure/node_modules/azure/lib/util/util.js ./node_modules/azure/node_modules/azure/lib/util/validate.js ./node_modules/azure/node_modules/azure/LICENSE.txt ./node_modules/azure/node_modules/azure/node_modules/wns/lib/wns.js ./node_modules/azure/node_modules/azure/node_modules/wns/LICENSE.txt 
./node_modules/azure/node_modules/azure/node_modules/wns/package.json ./node_modules/azure/node_modules/azure/node_modules/xml2js/lib/xml2js.js ./node_modules/azure/node_modules/azure/node_modules/xml2js/LICENSE ./node_modules/azure/node_modules/azure/node_modules/xml2js/node_modules/sax/lib/sax.js ./node_modules/azure/node_modules/azure/node_modules/xml2js/node_modules/sax/LICENSE ./node_modules/azure/node_modules/azure/node_modules/xml2js/node_modules/sax/package.json ./node_modules/azure/node_modules/azure/node_modules/xml2js/package.json ./node_modules/azure/node_modules/azure/package.json ./node_modules/azure/node_modules/colors/colors.js ./node_modules/azure/node_modules/colors/MIT-LICENSE.txt ./node_modules/azure/node_modules/colors/package.json ./node_modules/azure/node_modules/commander/index.js ./node_modules/azure/node_modules/commander/lib/commander.js ./node_modules/azure/node_modules/commander/node_modules/keypress/index.js ./node_modules/azure/node_modules/commander/node_modules/keypress/package.json ./node_modules/azure/node_modules/commander/package.json ./node_modules/azure/node_modules/dateformat/lib/dateformat.js ./node_modules/azure/node_modules/dateformat/package.json ./node_modules/azure/node_modules/easy-table/lib/table.js ./node_modules/azure/node_modules/easy-table/package.json ./node_modules/azure/node_modules/eyes/lib/eyes.js ./node_modules/azure/node_modules/eyes/LICENSE ./node_modules/azure/node_modules/eyes/package.json ./node_modules/azure/node_modules/log/index.js ./node_modules/azure/node_modules/log/lib/log.js ./node_modules/azure/node_modules/log/package.json ./node_modules/azure/node_modules/mime/LICENSE ./node_modules/azure/node_modules/mime/mime.js ./node_modules/azure/node_modules/mime/package.json ./node_modules/azure/node_modules/mime/types/mime.types ./node_modules/azure/node_modules/mime/types/node.types ./node_modules/azure/node_modules/node-uuid/LICENSE.md ./node_modules/azure/node_modules/node-uuid/package.json ./node_modules/azure/node_modules/node-uuid/uuid.js ./node_modules/azure/node_modules/qs/component.json ./node_modules/azure/node_modules/qs/index.js ./node_modules/azure/node_modules/qs/lib/head.js ./node_modules/azure/node_modules/qs/lib/querystring.js ./node_modules/azure/node_modules/qs/lib/tail.js ./node_modules/azure/node_modules/qs/package.json ./node_modules/azure/node_modules/qs/querystring.js ./node_modules/azure/node_modules/request/aws.js ./node_modules/azure/node_modules/request/forever.js ./node_modules/azure/node_modules/request/LICENSE ./node_modules/azure/node_modules/request/main.js ./node_modules/azure/node_modules/request/node_modules/form-data/lib/form_data.js ./node_modules/azure/node_modules/request/node_modules/form-data/node_modules/async/index.js ./node_modules/azure/node_modules/request/node_modules/form-data/node_modules/async/lib/async.js ./node_modules/azure/node_modules/request/node_modules/form-data/node_modules/async/LICENSE ./node_modules/azure/node_modules/request/node_modules/form-data/node_modules/async/package.json ./node_modules/azure/node_modules/request/node_modules/form-data/node_modules/combined-stream/lib/combined_stream.js ./node_modules/azure/node_modules/request/node_modules/form-data/node_modules/combined-stream/License ./node_modules/azure/node_modules/request/node_modules/form-data/node_modules/combined-stream/node_modules/delayed-stream/lib/delayed_stream.js ./node_modules/azure/node_modules/request/node_modules/form-data/node_modules/combined-stream/node_modules/delayed-stream/License 
./node_modules/azure/node_modules/request/node_modules/form-data/node_modules/combined-stream/node_modules/delayed-stream/package.json ./node_modules/azure/node_modules/request/node_modules/form-data/node_modules/combined-stream/package.json ./node_modules/azure/node_modules/request/node_modules/form-data/package.json ./node_modules/azure/node_modules/request/node_modules/mime/LICENSE ./node_modules/azure/node_modules/request/node_modules/mime/mime.js ./node_modules/azure/node_modules/request/node_modules/mime/package.json ./node_modules/azure/node_modules/request/node_modules/mime/types/mime.types ./node_modules/azure/node_modules/request/node_modules/mime/types/node.types ./node_modules/azure/node_modules/request/oauth.js ./node_modules/azure/node_modules/request/package.json ./node_modules/azure/node_modules/request/tunnel.js ./node_modules/azure/node_modules/request/uuid.js ./node_modules/azure/node_modules/request/vendor/cookie/index.js ./node_modules/azure/node_modules/request/vendor/cookie/jar.js ./node_modules/azure/node_modules/sax/lib/sax.js ./node_modules/azure/node_modules/sax/LICENSE ./node_modules/azure/node_modules/sax/package.json ./node_modules/azure/node_modules/streamline/AUTHORS ./node_modules/azure/node_modules/streamline/deps/narcissus/lib/decompiler.js ./node_modules/azure/node_modules/streamline/deps/narcissus/lib/definitions.js ./node_modules/azure/node_modules/streamline/deps/narcissus/lib/jsbrowser.js ./node_modules/azure/node_modules/streamline/deps/narcissus/lib/jsdecomp.js ./node_modules/azure/node_modules/streamline/deps/narcissus/lib/jsdefs.js ./node_modules/azure/node_modules/streamline/deps/narcissus/lib/jsexec.js ./node_modules/azure/node_modules/streamline/deps/narcissus/lib/jslex.js ./node_modules/azure/node_modules/streamline/deps/narcissus/lib/jsparse.js ./node_modules/azure/node_modules/streamline/deps/narcissus/lib/lexer.js ./node_modules/azure/node_modules/streamline/deps/narcissus/lib/parser.js ./node_modules/azure/node_modules/streamline/deps/narcissus/LICENSE ./node_modules/azure/node_modules/streamline/deps/narcissus/main.js ./node_modules/azure/node_modules/streamline/deps/narcissus/package.json ./node_modules/azure/node_modules/streamline/deps/narcissus/xfail/narcissus-failures.txt ./node_modules/azure/node_modules/streamline/deps/narcissus/xfail/narcissus-slow.txt ./node_modules/azure/node_modules/streamline/lib/callbacks/format.js ./node_modules/azure/node_modules/streamline/lib/callbacks/require-stub.js ./node_modules/azure/node_modules/streamline/lib/callbacks/runtime.js ./node_modules/azure/node_modules/streamline/lib/callbacks/transform.js ./node_modules/azure/node_modules/streamline/lib/compile.js ./node_modules/azure/node_modules/streamline/lib/compiler/command.js ./node_modules/azure/node_modules/streamline/lib/compiler/compile--fibers.js ./node_modules/azure/node_modules/streamline/lib/compiler/compile.js ./node_modules/azure/node_modules/streamline/lib/compiler/compile_.js ./node_modules/azure/node_modules/streamline/lib/compiler/index.js ./node_modules/azure/node_modules/streamline/lib/compiler/register.js ./node_modules/azure/node_modules/streamline/lib/fibers/runtime.js ./node_modules/azure/node_modules/streamline/lib/fibers/transform.js ./node_modules/azure/node_modules/streamline/lib/fibers/walker.js ./node_modules/azure/node_modules/streamline/lib/globals.js ./node_modules/azure/node_modules/streamline/lib/index.js ./node_modules/azure/node_modules/streamline/lib/register.js 
./node_modules/azure/node_modules/streamline/lib/require/client/require.js ./node_modules/azure/node_modules/streamline/lib/require/server/depend.js ./node_modules/azure/node_modules/streamline/lib/require/server/require.js ./node_modules/azure/node_modules/streamline/lib/streams/client/streams--fibers.js ./node_modules/azure/node_modules/streamline/lib/streams/client/streams.js ./node_modules/azure/node_modules/streamline/lib/streams/client/streams_.js ./node_modules/azure/node_modules/streamline/lib/streams/jsonRequest.js ./node_modules/azure/node_modules/streamline/lib/streams/readers.js ./node_modules/azure/node_modules/streamline/lib/streams/server/httpHelper.js ./node_modules/azure/node_modules/streamline/lib/streams/server/streams.js ./node_modules/azure/node_modules/streamline/lib/streams/streams.js ./node_modules/azure/node_modules/streamline/lib/tools/docTool.js ./node_modules/azure/node_modules/streamline/lib/transform.js ./node_modules/azure/node_modules/streamline/lib/util/flows--fibers.js ./node_modules/azure/node_modules/streamline/lib/util/flows.js ./node_modules/azure/node_modules/streamline/lib/util/flows_.js ./node_modules/azure/node_modules/streamline/lib/util/future.js ./node_modules/azure/node_modules/streamline/lib/util/index.js ./node_modules/azure/node_modules/streamline/lib/util/url.js ./node_modules/azure/node_modules/streamline/lib/util/uuid.js ./node_modules/azure/node_modules/streamline/module.js ./node_modules/azure/node_modules/streamline/package.json ./node_modules/azure/node_modules/tunnel/index.js ./node_modules/azure/node_modules/tunnel/lib/tunnel.js ./node_modules/azure/node_modules/tunnel/package.json ./node_modules/azure/node_modules/underscore/index.js ./node_modules/azure/node_modules/underscore/LICENSE ./node_modules/azure/node_modules/underscore/package.json ./node_modules/azure/node_modules/underscore/underscore.js ./node_modules/azure/node_modules/underscore.string/lib/underscore.string.js ./node_modules/azure/node_modules/underscore.string/package.json ./node_modules/azure/node_modules/validator/index.js ./node_modules/azure/node_modules/validator/lib/defaultError.js ./node_modules/azure/node_modules/validator/lib/entities.js ./node_modules/azure/node_modules/validator/lib/filter.js ./node_modules/azure/node_modules/validator/lib/index.js ./node_modules/azure/node_modules/validator/lib/validator.js ./node_modules/azure/node_modules/validator/lib/validators.js ./node_modules/azure/node_modules/validator/lib/xss.js ./node_modules/azure/node_modules/validator/LICENSE ./node_modules/azure/node_modules/validator/package.json ./node_modules/azure/node_modules/validator/validator.js ./node_modules/azure/node_modules/winston/lib/winston/common.js ./node_modules/azure/node_modules/winston/lib/winston/config/cli-config.js ./node_modules/azure/node_modules/winston/lib/winston/config/npm-config.js ./node_modules/azure/node_modules/winston/lib/winston/config/syslog-config.js ./node_modules/azure/node_modules/winston/lib/winston/config.js ./node_modules/azure/node_modules/winston/lib/winston/container.js ./node_modules/azure/node_modules/winston/lib/winston/exception.js ./node_modules/azure/node_modules/winston/lib/winston/logger.js ./node_modules/azure/node_modules/winston/lib/winston/transports/console.js ./node_modules/azure/node_modules/winston/lib/winston/transports/file.js ./node_modules/azure/node_modules/winston/lib/winston/transports/http.js ./node_modules/azure/node_modules/winston/lib/winston/transports/transport.js 
./node_modules/azure/node_modules/winston/lib/winston/transports/webhook.js ./node_modules/azure/node_modules/winston/lib/winston/transports.js ./node_modules/azure/node_modules/winston/lib/winston.js ./node_modules/azure/node_modules/winston/LICENSE ./node_modules/azure/node_modules/winston/node_modules/cycle/cycle.js ./node_modules/azure/node_modules/winston/node_modules/cycle/package.json ./node_modules/azure/node_modules/winston/node_modules/pkginfo/lib/pkginfo.js ./node_modules/azure/node_modules/winston/node_modules/pkginfo/package.json ./node_modules/azure/node_modules/winston/node_modules/request/aws.js ./node_modules/azure/node_modules/winston/node_modules/request/aws2.js ./node_modules/azure/node_modules/winston/node_modules/request/forever.js ./node_modules/azure/node_modules/winston/node_modules/request/LICENSE ./node_modules/azure/node_modules/winston/node_modules/request/main.js ./node_modules/azure/node_modules/winston/node_modules/request/mimetypes.js ./node_modules/azure/node_modules/winston/node_modules/request/oauth.js ./node_modules/azure/node_modules/winston/node_modules/request/package.json ./node_modules/azure/node_modules/winston/node_modules/request/tunnel.js ./node_modules/azure/node_modules/winston/node_modules/request/uuid.js ./node_modules/azure/node_modules/winston/node_modules/request/vendor/cookie/index.js ./node_modules/azure/node_modules/winston/node_modules/request/vendor/cookie/jar.js ./node_modules/azure/node_modules/winston/node_modules/stack-trace/lib/stack-trace.js ./node_modules/azure/node_modules/winston/node_modules/stack-trace/License ./node_modules/azure/node_modules/winston/node_modules/stack-trace/package.json ./node_modules/azure/node_modules/winston/package.json ./node_modules/azure/node_modules/xml2js/lib/xml2js.js ./node_modules/azure/node_modules/xml2js/LICENSE ./node_modules/azure/node_modules/xml2js/package.json ./node_modules/azure/node_modules/xmlbuilder/lib/index.js ./node_modules/azure/node_modules/xmlbuilder/lib/XMLBuilder.js ./node_modules/azure/node_modules/xmlbuilder/lib/XMLFragment.js ./node_modules/azure/node_modules/xmlbuilder/package.json ./node_modules/azure/package.json ./node_modules/dpush/lib/dpush.js ./node_modules/dpush/LICENSE.txt ./node_modules/dpush/package.json ./node_modules/express/.npmignore ./node_modules/express/.travis.yml ./node_modules/express/bin/express ./node_modules/express/History.md ./node_modules/express/index.js ./node_modules/express/lib/application.js ./node_modules/express/lib/express.js ./node_modules/express/lib/middleware.js ./node_modules/express/lib/request.js ./node_modules/express/lib/response.js ./node_modules/express/lib/router/index.js ./node_modules/express/lib/router/route.js ./node_modules/express/lib/utils.js ./node_modules/express/lib/view.js ./node_modules/express/LICENSE ./node_modules/express/Makefile ./node_modules/express/node_modules/buffer-crc32/.npmignore ./node_modules/express/node_modules/buffer-crc32/.travis.yml ./node_modules/express/node_modules/buffer-crc32/index.js ./node_modules/express/node_modules/buffer-crc32/package.json ./node_modules/express/node_modules/buffer-crc32/README.md ./node_modules/express/node_modules/buffer-crc32/tests/crc.test.js ./node_modules/express/node_modules/commander/.npmignore ./node_modules/express/node_modules/commander/.travis.yml ./node_modules/express/node_modules/commander/History.md ./node_modules/express/node_modules/commander/index.js ./node_modules/express/node_modules/commander/lib/commander.js 
./node_modules/express/node_modules/commander/Makefile ./node_modules/express/node_modules/commander/package.json ./node_modules/express/node_modules/commander/Readme.md ./node_modules/express/node_modules/connect/.npmignore ./node_modules/express/node_modules/connect/.travis.yml ./node_modules/express/node_modules/connect/index.js ./node_modules/express/node_modules/connect/lib/cache.js ./node_modules/express/node_modules/connect/lib/connect.js ./node_modules/express/node_modules/connect/lib/index.js ./node_modules/express/node_modules/connect/lib/middleware/basicAuth.js ./node_modules/express/node_modules/connect/lib/middleware/bodyParser.js ./node_modules/express/node_modules/connect/lib/middleware/compress.js ./node_modules/express/node_modules/connect/lib/middleware/cookieParser.js ./node_modules/express/node_modules/connect/lib/middleware/cookieSession.js ./node_modules/express/node_modules/connect/lib/middleware/csrf.js ./node_modules/express/node_modules/connect/lib/middleware/directory.js ./node_modules/express/node_modules/connect/lib/middleware/errorHandler.js ./node_modules/express/node_modules/connect/lib/middleware/favicon.js ./node_modules/express/node_modules/connect/lib/middleware/json.js ./node_modules/express/node_modules/connect/lib/middleware/limit.js ./node_modules/express/node_modules/connect/lib/middleware/logger.js ./node_modules/express/node_modules/connect/lib/middleware/methodOverride.js ./node_modules/express/node_modules/connect/lib/middleware/multipart.js ./node_modules/express/node_modules/connect/lib/middleware/query.js ./node_modules/express/node_modules/connect/lib/middleware/responseTime.js ./node_modules/express/node_modules/connect/lib/middleware/session/cookie.js ./node_modules/express/node_modules/connect/lib/middleware/session/memory.js ./node_modules/express/node_modules/connect/lib/middleware/session/session.js ./node_modules/express/node_modules/connect/lib/middleware/session/store.js ./node_modules/express/node_modules/connect/lib/middleware/session.js ./node_modules/express/node_modules/connect/lib/middleware/static.js ./node_modules/express/node_modules/connect/lib/middleware/staticCache.js ./node_modules/express/node_modules/connect/lib/middleware/timeout.js ./node_modules/express/node_modules/connect/lib/middleware/urlencoded.js ./node_modules/express/node_modules/connect/lib/middleware/vhost.js ./node_modules/express/node_modules/connect/lib/patch.js ./node_modules/express/node_modules/connect/lib/proto.js ./node_modules/express/node_modules/connect/lib/public/directory.html ./node_modules/express/node_modules/connect/lib/public/error.html ./node_modules/express/node_modules/connect/lib/public/favicon.ico ./node_modules/express/node_modules/connect/lib/public/icons/page.png ./node_modules/express/node_modules/connect/lib/public/icons/page_add.png ./node_modules/express/node_modules/connect/lib/public/icons/page_attach.png ./node_modules/express/node_modules/connect/lib/public/icons/page_code.png ./node_modules/express/node_modules/connect/lib/public/icons/page_copy.png ./node_modules/express/node_modules/connect/lib/public/icons/page_delete.png ./node_modules/express/node_modules/connect/lib/public/icons/page_edit.png ./node_modules/express/node_modules/connect/lib/public/icons/page_error.png ./node_modules/express/node_modules/connect/lib/public/icons/page_excel.png ./node_modules/express/node_modules/connect/lib/public/icons/page_find.png ./node_modules/express/node_modules/connect/lib/public/icons/page_gear.png 
./node_modules/express/node_modules/connect/lib/public/icons/page_go.png ./node_modules/express/node_modules/connect/lib/public/icons/page_green.png ./node_modules/express/node_modules/connect/lib/public/icons/page_key.png ./node_modules/express/node_modules/connect/lib/public/icons/page_lightning.png ./node_modules/express/node_modules/connect/lib/public/icons/page_link.png ./node_modules/express/node_modules/connect/lib/public/icons/page_paintbrush.png ./node_modules/express/node_modules/connect/lib/public/icons/page_paste.png ./node_modules/express/node_modules/connect/lib/public/icons/page_red.png ./node_modules/express/node_modules/connect/lib/public/icons/page_refresh.png ./node_modules/express/node_modules/connect/lib/public/icons/page_save.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_acrobat.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_actionscript.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_add.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_c.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_camera.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_cd.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_code.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_code_red.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_coldfusion.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_compressed.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_copy.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_cplusplus.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_csharp.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_cup.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_database.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_delete.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_dvd.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_edit.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_error.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_excel.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_find.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_flash.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_freehand.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_gear.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_get.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_go.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_h.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_horizontal.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_key.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_lightning.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_link.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_magnify.png 
./node_modules/express/node_modules/connect/lib/public/icons/page_white_medal.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_office.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_paint.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_paintbrush.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_paste.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_php.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_picture.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_powerpoint.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_put.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_ruby.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_stack.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_star.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_swoosh.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_text.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_text_width.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_tux.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_vector.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_visualstudio.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_width.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_word.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_world.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_wrench.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_zip.png ./node_modules/express/node_modules/connect/lib/public/icons/page_word.png ./node_modules/express/node_modules/connect/lib/public/icons/page_world.png ./node_modules/express/node_modules/connect/lib/public/style.css ./node_modules/express/node_modules/connect/lib/utils.js ./node_modules/express/node_modules/connect/LICENSE ./node_modules/express/node_modules/connect/node_modules/bytes/.npmignore ./node_modules/express/node_modules/connect/node_modules/bytes/component.json ./node_modules/express/node_modules/connect/node_modules/bytes/History.md ./node_modules/express/node_modules/connect/node_modules/bytes/index.js ./node_modules/express/node_modules/connect/node_modules/bytes/Makefile ./node_modules/express/node_modules/connect/node_modules/bytes/package.json ./node_modules/express/node_modules/connect/node_modules/bytes/Readme.md ./node_modules/express/node_modules/connect/node_modules/formidable/.npmignore ./node_modules/express/node_modules/connect/node_modules/formidable/.travis.yml ./node_modules/express/node_modules/connect/node_modules/formidable/benchmark/bench-multipart-parser.js ./node_modules/express/node_modules/connect/node_modules/formidable/example/json.js ./node_modules/express/node_modules/connect/node_modules/formidable/example/post.js ./node_modules/express/node_modules/connect/node_modules/formidable/example/upload.js ./node_modules/express/node_modules/connect/node_modules/formidable/index.js ./node_modules/express/node_modules/connect/node_modules/formidable/lib/file.js ./node_modules/express/node_modules/connect/node_modules/formidable/lib/incoming_form.js 
./node_modules/express/node_modules/connect/node_modules/formidable/lib/index.js ./node_modules/express/node_modules/connect/node_modules/formidable/lib/json_parser.js ./node_modules/express/node_modules/connect/node_modules/formidable/lib/multipart_parser.js ./node_modules/express/node_modules/connect/node_modules/formidable/lib/octet_parser.js ./node_modules/express/node_modules/connect/node_modules/formidable/lib/querystring_parser.js ./node_modules/express/node_modules/connect/node_modules/formidable/LICENSE ./node_modules/express/node_modules/connect/node_modules/formidable/package.json ./node_modules/express/node_modules/connect/node_modules/formidable/Readme.md ./node_modules/express/node_modules/connect/node_modules/formidable/test/common.js ./node_modules/express/node_modules/connect/node_modules/formidable/test/fixture/file/beta-sticker-1.png ./node_modules/express/node_modules/connect/node_modules/formidable/test/fixture/file/binaryfile.tar.gz ./node_modules/express/node_modules/connect/node_modules/formidable/test/fixture/file/blank.gif ./node_modules/express/node_modules/connect/node_modules/formidable/test/fixture/file/funkyfilename.txt ./node_modules/express/node_modules/connect/node_modules/formidable/test/fixture/file/menu_separator.png ./node_modules/express/node_modules/connect/node_modules/formidable/test/fixture/file/plain.txt ./node_modules/express/node_modules/connect/node_modules/formidable/test/fixture/http/special-chars-in-filename/info.md ./node_modules/express/node_modules/connect/node_modules/formidable/test/fixture/js/encoding.js ./node_modules/express/node_modules/connect/node_modules/formidable/test/fixture/js/misc.js ./node_modules/express/node_modules/connect/node_modules/formidable/test/fixture/js/no-filename.js ./node_modules/express/node_modules/connect/node_modules/formidable/test/fixture/js/preamble.js ./node_modules/express/node_modules/connect/node_modules/formidable/test/fixture/js/special-chars-in-filename.js ./node_modules/express/node_modules/connect/node_modules/formidable/test/fixture/js/workarounds.js ./node_modules/express/node_modules/connect/node_modules/formidable/test/fixture/multipart.js ./node_modules/express/node_modules/connect/node_modules/formidable/test/integration/test-fixtures.js ./node_modules/express/node_modules/connect/node_modules/formidable/test/integration/test-json.js ./node_modules/express/node_modules/connect/node_modules/formidable/test/integration/test-octet-stream.js ./node_modules/express/node_modules/connect/node_modules/formidable/test/legacy/common.js ./node_modules/express/node_modules/connect/node_modules/formidable/test/legacy/integration/test-multipart-parser.js ./node_modules/express/node_modules/connect/node_modules/formidable/test/legacy/simple/test-file.js ./node_modules/express/node_modules/connect/node_modules/formidable/test/legacy/simple/test-incoming-form.js ./node_modules/express/node_modules/connect/node_modules/formidable/test/legacy/simple/test-multipart-parser.js ./node_modules/express/node_modules/connect/node_modules/formidable/test/legacy/simple/test-querystring-parser.js ./node_modules/express/node_modules/connect/node_modules/formidable/test/legacy/system/test-multi-video-upload.js ./node_modules/express/node_modules/connect/node_modules/formidable/test/run.js ./node_modules/express/node_modules/connect/node_modules/formidable/test/standalone/test-connection-aborted.js ./node_modules/express/node_modules/connect/node_modules/formidable/test/standalone/test-content-transfer-encoding.js 
./node_modules/express/node_modules/connect/node_modules/formidable/test/standalone/test-issue-46.js ./node_modules/express/node_modules/connect/node_modules/formidable/test/tools/base64.html ./node_modules/express/node_modules/connect/node_modules/formidable/test/unit/test-file.js ./node_modules/express/node_modules/connect/node_modules/formidable/test/unit/test-incoming-form.js ./node_modules/express/node_modules/connect/node_modules/formidable/tool/record.js ./node_modules/express/node_modules/connect/node_modules/pause/.npmignore ./node_modules/express/node_modules/connect/node_modules/pause/History.md ./node_modules/express/node_modules/connect/node_modules/pause/index.js ./node_modules/express/node_modules/connect/node_modules/pause/Makefile ./node_modules/express/node_modules/connect/node_modules/pause/package.json ./node_modules/express/node_modules/connect/node_modules/pause/Readme.md ./node_modules/express/node_modules/connect/node_modules/qs/.gitmodules ./node_modules/express/node_modules/connect/node_modules/qs/.npmignore ./node_modules/express/node_modules/connect/node_modules/qs/index.js ./node_modules/express/node_modules/connect/node_modules/qs/package.json ./node_modules/express/node_modules/connect/node_modules/qs/Readme.md ./node_modules/express/node_modules/connect/package.json ./node_modules/express/node_modules/connect/test.js ./node_modules/express/node_modules/cookie/.npmignore ./node_modules/express/node_modules/cookie/.travis.yml ./node_modules/express/node_modules/cookie/index.js ./node_modules/express/node_modules/cookie/package.json ./node_modules/express/node_modules/cookie/README.md ./node_modules/express/node_modules/cookie/test/mocha.opts ./node_modules/express/node_modules/cookie/test/parse.js ./node_modules/express/node_modules/cookie/test/serialize.js ./node_modules/express/node_modules/cookie-signature/.npmignore ./node_modules/express/node_modules/cookie-signature/History.md ./node_modules/express/node_modules/cookie-signature/index.js ./node_modules/express/node_modules/cookie-signature/Makefile ./node_modules/express/node_modules/cookie-signature/package.json ./node_modules/express/node_modules/cookie-signature/Readme.md ./node_modules/express/node_modules/debug/.npmignore ./node_modules/express/node_modules/debug/component.json ./node_modules/express/node_modules/debug/debug.js ./node_modules/express/node_modules/debug/example/app.js ./node_modules/express/node_modules/debug/example/browser.html ./node_modules/express/node_modules/debug/example/wildcards.js ./node_modules/express/node_modules/debug/example/worker.js ./node_modules/express/node_modules/debug/History.md ./node_modules/express/node_modules/debug/index.js ./node_modules/express/node_modules/debug/lib/debug.js ./node_modules/express/node_modules/debug/package.json ./node_modules/express/node_modules/debug/Readme.md ./node_modules/express/node_modules/fresh/.npmignore ./node_modules/express/node_modules/fresh/index.js ./node_modules/express/node_modules/fresh/Makefile ./node_modules/express/node_modules/fresh/package.json ./node_modules/express/node_modules/fresh/Readme.md ./node_modules/express/node_modules/methods/index.js ./node_modules/express/node_modules/methods/package.json ./node_modules/express/node_modules/mkdirp/.npmignore ./node_modules/express/node_modules/mkdirp/.travis.yml ./node_modules/express/node_modules/mkdirp/examples/pow.js ./node_modules/express/node_modules/mkdirp/index.js ./node_modules/express/node_modules/mkdirp/LICENSE 
./node_modules/express/node_modules/mkdirp/package.json ./node_modules/express/node_modules/mkdirp/README.markdown ./node_modules/express/node_modules/mkdirp/test/chmod.js ./node_modules/express/node_modules/mkdirp/test/clobber.js ./node_modules/express/node_modules/mkdirp/test/mkdirp.js ./node_modules/express/node_modules/mkdirp/test/perm.js ./node_modules/express/node_modules/mkdirp/test/perm_sync.js ./node_modules/express/node_modules/mkdirp/test/race.js ./node_modules/express/node_modules/mkdirp/test/rel.js ./node_modules/express/node_modules/mkdirp/test/return.js ./node_modules/express/node_modules/mkdirp/test/return_sync.js ./node_modules/express/node_modules/mkdirp/test/root.js ./node_modules/express/node_modules/mkdirp/test/sync.js ./node_modules/express/node_modules/mkdirp/test/umask.js ./node_modules/express/node_modules/mkdirp/test/umask_sync.js ./node_modules/express/node_modules/range-parser/.npmignore ./node_modules/express/node_modules/range-parser/History.md ./node_modules/express/node_modules/range-parser/index.js ./node_modules/express/node_modules/range-parser/Makefile ./node_modules/express/node_modules/range-parser/package.json ./node_modules/express/node_modules/range-parser/Readme.md ./node_modules/express/node_modules/send/.npmignore ./node_modules/express/node_modules/send/History.md ./node_modules/express/node_modules/send/index.js ./node_modules/express/node_modules/send/lib/send.js ./node_modules/express/node_modules/send/lib/utils.js ./node_modules/express/node_modules/send/Makefile ./node_modules/express/node_modules/send/node_modules/mime/LICENSE ./node_modules/express/node_modules/send/node_modules/mime/mime.js ./node_modules/express/node_modules/send/node_modules/mime/package.json ./node_modules/express/node_modules/send/node_modules/mime/README.md ./node_modules/express/node_modules/send/node_modules/mime/test.js ./node_modules/express/node_modules/send/node_modules/mime/types/mime.types ./node_modules/express/node_modules/send/node_modules/mime/types/node.types ./node_modules/express/node_modules/send/package.json ./node_modules/express/node_modules/send/Readme.md ./node_modules/express/package.json ./node_modules/express/Readme.md ./node_modules/mpns/lib/mpns.js ./node_modules/mpns/package.json ./node_modules/oauth/index.js ./node_modules/oauth/lib/oauth.js ./node_modules/oauth/lib/oauth2.js ./node_modules/oauth/lib/sha1.js ./node_modules/oauth/lib/_utils.js ./node_modules/oauth/LICENSE ./node_modules/oauth/package.json ./node_modules/pusher/index.js ./node_modules/pusher/lib/pusher.js ./node_modules/pusher/node_modules/request/aws.js ./node_modules/pusher/node_modules/request/aws2.js ./node_modules/pusher/node_modules/request/forever.js ./node_modules/pusher/node_modules/request/LICENSE ./node_modules/pusher/node_modules/request/main.js ./node_modules/pusher/node_modules/request/mimetypes.js ./node_modules/pusher/node_modules/request/oauth.js ./node_modules/pusher/node_modules/request/package.json ./node_modules/pusher/node_modules/request/tunnel.js ./node_modules/pusher/node_modules/request/uuid.js ./node_modules/pusher/node_modules/request/vendor/cookie/index.js ./node_modules/pusher/node_modules/request/vendor/cookie/jar.js ./node_modules/pusher/package.json ./node_modules/request/forever.js ./node_modules/request/LICENSE ./node_modules/request/main.js ./node_modules/request/mimetypes.js ./node_modules/request/oauth.js ./node_modules/request/package.json ./node_modules/request/uuid.js ./node_modules/request/vendor/cookie/index.js 
./node_modules/request/vendor/cookie/jar.js ./node_modules/sax/lib/sax.js ./node_modules/sax/LICENSE ./node_modules/sax/package.json ./node_modules/sendgrid/index.js ./node_modules/sendgrid/lib/email.js ./node_modules/sendgrid/lib/file_handler.js ./node_modules/sendgrid/lib/sendgrid.js ./node_modules/sendgrid/lib/smtpapi_headers.js ./node_modules/sendgrid/lib/validation.js ./node_modules/sendgrid/MIT.LICENSE ./node_modules/sendgrid/node_modules/mime/LICENSE ./node_modules/sendgrid/node_modules/mime/mime.js ./node_modules/sendgrid/node_modules/mime/package.json ./node_modules/sendgrid/node_modules/mime/types/mime.types ./node_modules/sendgrid/node_modules/mime/types/node.types ./node_modules/sendgrid/node_modules/nodemailer/lib/engines/sendmail.js ./node_modules/sendgrid/node_modules/nodemailer/lib/engines/ses.js ./node_modules/sendgrid/node_modules/nodemailer/lib/engines/smtp.js ./node_modules/sendgrid/node_modules/nodemailer/lib/engines/stub.js ./node_modules/sendgrid/node_modules/nodemailer/lib/helpers.js ./node_modules/sendgrid/node_modules/nodemailer/lib/nodemailer.js ./node_modules/sendgrid/node_modules/nodemailer/lib/transport.js ./node_modules/sendgrid/node_modules/nodemailer/lib/wellknown.js ./node_modules/sendgrid/node_modules/nodemailer/lib/xoauth.js ./node_modules/sendgrid/node_modules/nodemailer/LICENSE ./node_modules/sendgrid/node_modules/nodemailer/node_modules/mailcomposer/lib/dkim.js ./node_modules/sendgrid/node_modules/nodemailer/node_modules/mailcomposer/lib/mailcomposer.js ./node_modules/sendgrid/node_modules/nodemailer/node_modules/mailcomposer/lib/punycode.js ./node_modules/sendgrid/node_modules/nodemailer/node_modules/mailcomposer/lib/urlfetch.js ./node_modules/sendgrid/node_modules/nodemailer/node_modules/mailcomposer/LICENSE ./node_modules/sendgrid/node_modules/nodemailer/node_modules/mailcomposer/node_modules/mimelib-noiconv/content-types.js ./node_modules/sendgrid/node_modules/nodemailer/node_modules/mailcomposer/node_modules/mimelib-noiconv/index.js ./node_modules/sendgrid/node_modules/nodemailer/node_modules/mailcomposer/node_modules/mimelib-noiconv/LICENSE ./node_modules/sendgrid/node_modules/nodemailer/node_modules/mailcomposer/node_modules/mimelib-noiconv/mime-functions.js ./node_modules/sendgrid/node_modules/nodemailer/node_modules/mailcomposer/node_modules/mimelib-noiconv/package.json ./node_modules/sendgrid/node_modules/nodemailer/node_modules/mailcomposer/package.json ./node_modules/sendgrid/node_modules/nodemailer/node_modules/simplesmtp/index.js ./node_modules/sendgrid/node_modules/nodemailer/node_modules/simplesmtp/lib/client.js ./node_modules/sendgrid/node_modules/nodemailer/node_modules/simplesmtp/lib/pool.js ./node_modules/sendgrid/node_modules/nodemailer/node_modules/simplesmtp/lib/server.js ./node_modules/sendgrid/node_modules/nodemailer/node_modules/simplesmtp/lib/starttls.js ./node_modules/sendgrid/node_modules/nodemailer/node_modules/simplesmtp/LICENSE ./node_modules/sendgrid/node_modules/nodemailer/node_modules/simplesmtp/node_modules/rai/cert/cert.pem ./node_modules/sendgrid/node_modules/nodemailer/node_modules/simplesmtp/node_modules/rai/cert/key.pem ./node_modules/sendgrid/node_modules/nodemailer/node_modules/simplesmtp/node_modules/rai/lib/mockup.js ./node_modules/sendgrid/node_modules/nodemailer/node_modules/simplesmtp/node_modules/rai/lib/rai.js ./node_modules/sendgrid/node_modules/nodemailer/node_modules/simplesmtp/node_modules/rai/lib/starttls.js 
./node_modules/sendgrid/node_modules/nodemailer/node_modules/simplesmtp/node_modules/rai/LICENSE ./node_modules/sendgrid/node_modules/nodemailer/node_modules/simplesmtp/node_modules/rai/package.json ./node_modules/sendgrid/node_modules/nodemailer/node_modules/simplesmtp/package.json ./node_modules/sendgrid/node_modules/nodemailer/package.json ./node_modules/sendgrid/node_modules/step/lib/step.js ./node_modules/sendgrid/node_modules/step/package.json ./node_modules/sendgrid/node_modules/underscore/index.js ./node_modules/sendgrid/node_modules/underscore/LICENSE ./node_modules/sendgrid/node_modules/underscore/package.json ./node_modules/sendgrid/node_modules/underscore/underscore.js ./node_modules/sendgrid/package.json ./node_modules/sqlserver/lib/sql.js ./node_modules/sqlserver/lib/sqlserver.native.js ./node_modules/sqlserver/lib/sqlserver.node ./node_modules/sqlserver/package.json ./node_modules/tripwire/lib/native/windows/x86/tripwire.node ./node_modules/tripwire/lib/tripwire.js ./node_modules/tripwire/LICENSE.txt ./node_modules/tripwire/package.json ./node_modules/underscore/LICENSE ./node_modules/underscore/package.json ./node_modules/underscore/underscore.js ./node_modules/underscore.string/lib/underscore.string.js ./node_modules/underscore.string/package.json ./node_modules/wns/lib/wns.js ./node_modules/wns/LICENSE.txt ./node_modules/wns/package.json ./node_modules/xml2js/lib/xml2js.js ./node_modules/xml2js/LICENSE ./node_modules/xml2js/node_modules/sax/lib/sax.js ./node_modules/xml2js/node_modules/sax/LICENSE ./node_modules/xml2js/node_modules/sax/package.json ./node_modules/xml2js/package.json ./node_modules/xmlbuilder/lib/index.js ./node_modules/xmlbuilder/lib/XMLBuilder.js ./node_modules/xmlbuilder/lib/XMLFragment.js ./node_modules/xmlbuilder/package.json ./runtime/core.js ./runtime/filehelpers.js ./runtime/jsonwebtoken.js ./runtime/logger.js ./runtime/logwriter.js ./runtime/metrics.js ./runtime/query/expressions.js ./runtime/query/expressionvisitor.js ./runtime/query/queryparser.js ./runtime/request/authentication/facebook.js ./runtime/request/authentication/google.js ./runtime/request/authentication/microsoftaccount.js ./runtime/request/authentication/twitter.js ./runtime/request/dataoperation.js ./runtime/request/datapipeline.js ./runtime/request/html/corshelper.js ./runtime/request/html/crossdomainhandler.js ./runtime/request/html/templates/crossdomainbridge.html ./runtime/request/html/templates/loginviaiframe.html ./runtime/request/html/templates/loginviaiframereceiver.html ./runtime/request/html/templates/loginviapostmessage.html ./runtime/request/html/templating.js ./runtime/request/loginhandler.js ./runtime/request/middleware/allowHandler.js ./runtime/request/middleware/authenticate.js ./runtime/request/middleware/authorize.js ./runtime/request/middleware/bodyParser.js ./runtime/request/middleware/errorHandler.js ./runtime/request/middleware/requestLimit.js ./runtime/request/request.js ./runtime/request/requesthandler.js ./runtime/request/schedulerhandler.js ./runtime/request/statushandler.js ./runtime/request/tablehandler.js ./runtime/resources.js ./runtime/script/apibuilder.js ./runtime/script/metadata.js ./runtime/script/push/notify-apns.js ./runtime/script/push/notify-gcm.js ./runtime/script/push/notify-mpns.js ./runtime/script/push/notify-wns.js ./runtime/script/push/notify.js ./runtime/script/scriptcache.js ./runtime/script/scripterror.js ./runtime/script/scriptloader.js ./runtime/script/scriptmanager.js ./runtime/script/scriptstate.js ./runtime/script/sqladapter.js 
./runtime/script/table.js ./runtime/server.js ./runtime/statuscodes.js ./runtime/storage/sqlbooleanizer.js ./runtime/storage/sqlformatter.js ./runtime/storage/sqlhelpers.js ./runtime/storage/storage.js ./runtime/Zumo.Node.js ./static/client/MobileServices.Web-1.0.0.js ./static/client/MobileServices.Web-1.0.0.min.js ./static/default.htm ./static/robots.txt ./Web.config

    Read the article

  • SQLAuthority News – TechEd India – April 12-14, 2010 Bangalore – An Unforgettable Experience – An Op

    - by pinaldave
    TechEd India was one of the largest technology events in India, led by Microsoft. It was attended by more than 3,000 technology enthusiasts, making it one of the most well-organized events of the year. Though I have attempted to attend almost every technology event here, I have not seen a bigger or better event in the Indian subcontinent. There were 21 technical tracks at Tech·Ed India 2010, spanning more than 745 learning opportunities. I was fortunate enough to be a part of the whole event, both as a speaker and as a delegate.

TechEd India Speaker Badge and A Token of Lifetime

Hotel Selection

I presented three different sessions at TechEd India and was also part of a panel discussion. (The details of the sessions are given at the end of this blog post.) Due to extensive traveling, I occasionally stay away from my family. For this reason, I took my wife Nupur and my daughter Shaivi (8 months old) to the event along with me. We stayed at the same hotel where the event was organized, so as to maximize my time bonding with my family while still having time to network with the technology community. The hotel, Lalit Ashok, is the largest and most luxurious venue one can find in Bangalore, located in the middle of the city. It was a bit pricey, but considering all the advantages, I decided to book there.

Hotel Lalit Ashok

Nupur Dave and Shaivi Dave

Arrival Day – DAY 0 – April 11, 2010

I reached the event a day early, and that was a wise decision, for I was able to relax a bit and go over my presentation for the next day. I am the kind of person who likes to get everything ready ahead of time. I was also able to enjoy a pleasant evening with several Microsoft employees and family friends, and I even checked out the location where I would be presenting the next day. I was fortunate enough to meet Bijoy Singhal from Microsoft, who helped me out with a few logistics issues that occurred the day before. I was not aware that the very next day he was going to be "The Man" of the TechEd 2010 event. Vinod Kumar from Microsoft was very kind and talked to me about my upcoming session; he gave me some really helpful suggestions that I was able to incorporate into my presentation. Finally, I was able to meet Abhishek Kant from Microsoft; his valuable suggestions and unlimited passion have inspired many people like me to work with the community. Pradipta from Microsoft was also around and extremely busy with logistics; even so, he found some spare time to chat with me and the other community leaders. I also met Harish Ranganathan and Sachin Rathi, both from Microsoft, and it was fascinating to listen to the two of them talk about SharePoint. I have no words to express how overwhelmed I was by all these passionate young guys - Pradipta, Vinod, Bijoy, Harish, Sachin and Abhishek (of course!).

Map of TechEd India 2010 Event

Day 1 – April 12, 2010

From morning until night, this was truly a very busy day for me. I had two presentations and one panel discussion, and needless to say, a few meetings to attend as well. The day started with a keynote from S. Somasegar, where he announced the launch of Visual Studio 2010. The keynote area was really eye-catching because of the very large, bigger-than-life uniform screen; it was truly a sight to see.
The title music of the keynote was very interesting, and it featured Bijoy Singhal as the model. It was fun to talk to him afterwards, when we laughed together at jokes about his modeling assignment.

TechEd India Keynote Opening Featuring Bijoy

TechEd India 2010 Keynote – S. Somasegar

Time: 11:15am – 11:45am
Session 1: True Lies of SQL Server – SQL Myth Buster

Following the excellent keynote, I had my very first session, on the subject of SQL Server myth busting. At first I was a bit nervous: it came right after the keynote, it was my very first session, and during my presentation I saw lots of Microsoft product team members in the audience. Well, it went really well and I had a very good discussion with the attendees. I felt that well begun was half done, and my confidence was restored. Right after the session, I met a few of my community friends and had meaningful discussions with them on many subjects. The abstract of the session is as follows: In this 30-minute demo session, I am going to briefly demonstrate a few SQL Server myths and their resolutions, backing them up with demos. This demo presentation is a must-attend for all developers and administrators who come to the event. This is going to be a very quick yet fun session.

Pinal Presenting session at TechEd India 2010

Time: 1:00 PM – 2:00 PM
Lunch with Somasegar

After the session I went to see my daughter, and then headed right away to the lunch with S. Somasegar – the keynote speaker and senior vice president of the Developer Division at Microsoft. I really thank Abhishek, who made it possible for us. Because of his efforts, all the MVPs had the opportunity to meet such a legendary person and to talk with him about Microsoft technology. Though Somasegar holds such a high position at Microsoft, he is very polite and a real gentleman; how I wish everybody in the industry were like him. Believe me, if you spread love and kindness, that is what you will receive back. As soon as lunch was over, I ran to the session hall, as my second presentation was about to start.

Time: 2:30pm – 3:30pm
Session 2: Master Data Services in Microsoft SQL Server 2008 R2

Business Intelligence was a subject widely talked about at TechEd, and everybody was interested in it; I did not excuse myself from this great topic either. I consider myself fortunate to have been presenting on Master Data Services at TechEd. When I initially learned this subject, I was a bit confused about the usage of the tool. Later on, I decided that I would talk about how we developers and DBAs fail to understand something as simple as this and, even worse, create confusion about the technology. During system design, it is very important to have reference material or master lookup tables. I talked about exactly that and kept it as the central theme of the session. The session went very well and I received lots of interesting questions, and I got many compliments for presenting this subject through real-life scenarios. I really thank Rushabh Mehta (CEO, Solid Quality Mentors India) for his supportive suggestions, which helped me prepare both the slide deck and the subject.

Pinal Presenting session at TechEd India 2010

The abstract of the session is as follows: SQL Server Master Data Services will ship with SQL Server 2008 R2 and will improve Microsoft's platform appeal.
This session provides an in-depth demonstration of MDS features and highlights important usage scenarios. Master Data Services enables a consistent decision-making process by allowing you to create, manage and propagate changes from a single master view of your business entities. Also, the MDS master data hub, which is a vital component, helps ensure consistency of reporting across systems and delivers faster and more accurate results across the enterprise. We will talk about establishing the basis for a centralized approach to defining, deploying, and managing master data in the enterprise.

Pinal Presenting session at TechEd India 2010

The day was still not over for me. I ran into several friends, but we were not able to keep our enthusiasm under control about all the rumors that SQL Server 2008 R2 was about to be launched in tomorrow's keynote. I then ran to my third and final technical event for the day – a panel discussion with the top technologists of India.

Time: 5:00pm – 6:00pm
Panel Discussion: Harness the power of Web – SEO and Technical Blogging

Having delivered two technical sessions by this time, I was a bit tired but no less enthusiastic about talking blogging and technology. We discussed many different topics. I told them that the most important aspect of any blog is its content. We discussed in depth the issue of plagiarism and how to avoid it. Another topic of discussion was how we technology bloggers can create awareness in the community about what the right kind of blogging is and which acts are morally and technically wrong. A couple of questions were raised about what kind of liberty a person has when writing a blog. Well, it was generally agreed that a blog is mainly a representation of our ideas and thoughts; it should not be governed by external entities. As long as one is writing what they really want to say, without providing incorrect information or practicing plagiarism, a blogger should be allowed to express himself. The panel discussion was supposed to be over in an hour, but the interest of the participants was remarkable, so it was extended for 30 minutes more. Finally, we decided to bring the discussion to a close and agreed that we will continue the topic next year.

TechEd India Panel Discussion on Web, Technology and SEO

Surprisingly, the day was just beginning after all of this. By this time, I had met almost all the MVPs who had arrived at the event, as well as many Microsoft employees. There were lots of community folks present, too. I decided that I would go meet several friends from the community who communicate with me on SQLAuthority.com. I also met Abhishek Baxi and had a good talk with him regarding Win Mobile and Twitter. He also took a very quick video of me in which I spoke in my mother tongue, Gujarati. It was funny that I had talked in Gujarati almost all day, but when I was speaking in the interview I could not find the right Gujarati words. I think we all think in English when we think about technology, so as to be universal. After meeting them, I headed towards the Speakers' Dinner.

Time: 8:00 PM – onwards
Speakers Dinner

The speakers' dinner was indeed a wonderful opportunity for all the speakers to get together and relax. We talked about so many different things, from Xbox to Hindi movies, and from SQL to samosas. I just cannot express how much fun I had. After a long evening, when I returned to my room and met Shaivi, I instantly felt relaxed.
Kids are really gifts from God. Today was a really long but exciting day. So many things happened in just one day: the Visual Studio launch, lunch with Somasegar, two technical sessions, one panel discussion, a community leaders meeting, the speakers' dinner and, last but not least, playing with my child! A perfect day!

Day 2 – April 13, 2010

Today started with a bang with the excellent keynote by Kamal Hathi, who launched SQL Server 2008 R2 in India and demonstrated the power of PowerPivot to all of us. 101 million rows in Excel brought lots of applause from the audience.

Kamal Hathi Presenting Keynote at TechEd India 2010

The day was a bit easier for me. I had no sessions and no events planned, just a few meetings for the second day of the event. I sat in the speakers' lounge for half the day and met many people there. I attended nearly nine different meetings, on very different subjects. Here is a list of the topics of the community-related meetings:

SQL PASS and its involvement in India and the subcontinent
How to start community blogging
Forums and developing an aptitude for technology
Ahmedabad/Gandhinagar User Groups and their development
SharePoint and SQL
Business meeting – a client meeting
Business meeting – a potential performance tuning project
Business meeting – Solid Quality Mentors (SolidQ)
And family friends

Pinal Dave at TechEd India

The day passed by very quickly with these meetings. In the evening, I headed to the Partners Expo with friends and checked out a few of the booths. I really wanted to talk about some of the products, but because of the freebies there was such a crowd that I finally decided to just take the partners' contact details. I will now start sending them my queries and, hopefully, will have my questions answered. Nupur and Shaivi also had one meeting to attend; it was with our family friend Vijay Raj. Vijay is someone who loves technology, and loves it more than anybody. I see him growing and learning every day, yet still remaining 'human'. I believe that if someone else acquired as much knowledge as him, that person would become either a computer or a cyborg. Vijay, however, is still a kind gentleman and remains our close family friend. Shaivi was really happy to play with Uncle Vijay.

Pinal Dave and Vijay Raj

Renuka Prasad, a Microsoft MVP, impressed me with his passion and knowledge of SQL. Every time he gives me credit for his success, I realize how humble he is. He has far more certifications than I do and has worked with SQL for many more years than me. He is an excellent photographer as well; most of the photos in this blog post were taken by him. I told him that if he ever wants a part-time job, he could do photography very well.

Pinal Dave and Renuka Prasad

I also met L Srividya from Microsoft, whom I had been looking forward to meeting. She is such a bundle of knowledge that everyone would surely learn a lot from her. I was able to get a few minutes with her and, well, I felt confident. She enlightened me on SQL Server BI concepts, domain management, SQL Server security and a few other interesting details. I also had a wonderful time talking about SharePoint with fellow Solid Quality Mentor Joy Rathnayake. He is very passionate about SharePoint, but when you talk .NET and SQL with him, he is still overwhelmingly knowledgeable. In fact, while talking to him, I found out that the recent training he delivered was on SQL Server 2008 R2.
I joked with him that it hurts my ego that he is now more popular in SQL training and consulting than me. I am sure all of you agree that working with good people is a gift from God. I am fortunate enough to work with the best of the best industry experts. It was a great pleasure to hang out with my Community friends – Ahswin Kini, HimaBindu Vejella, Vasudev G, Suprotim Agrawal, Dhananjay, Vikram Pendse, Mahesh Dhola, Mahesh Mitkari, Manu Zacharia, Shobhan, Hardik Shah, Ashish Mohta, Manan, Subodh Sohani and Sanjay Shetty (of course!). (Please let me know if I met you at the event and forgot to list your name here.)

Time: 8:00 PM – onwards Community Leaders Dinner

After lots of meetings, I headed towards the Community Leaders dinner and met almost all the folks I had met in the morning. The discussion was almost the same, but the really good thing was that we were enjoying it. The food was really good. Nupur was invited to the event, but Shaivi could not come. When Nupur tried to enter, she was stopped because Shaivi did not have a pass for the dinner. Nupur explained that Shaivi is only 8 months old, does not eat outside food, and cannot stay by herself at this age, but the doorkeeper did not agree and insisted that without the entry details Shaivi could not go in, though Nupur could. Nupur called me on the phone and asked me to help her out. By the time I was outside, the organizer of the event had reached the door and happily approved Shaivi to join the party. Once at the party, Shaivi had lots of fun meeting so many people.

Shaivi Dave and Abhishek Kant

Dean Guida (Infragistics President and CEO) and Pinal Dave (SQLAuthority.com)

Day 3 – April 14, 2010

Though it was the last day, I was very excited, as I was about to present my very favorite session. Query Optimization and Performance Tuning is my domain expertise, and I make my living by consulting and training on the same. Today's session was on that subject and, as an additional twist, another subject, Spatial Database, was presented. I was always intrigued by Spatial Database and I have enjoyed learning about it; however, I had never thought about Spatial Indexing before it was decided that I would do this session. I really thank Solid Quality Mentor Dr. Greg Low for his assistance in helping me prepare the slide deck and review the content. Furthermore, today was really what I call my 'learning day'. So far I had not attended any session at TechEd and I felt a bit down about that. Everybody spends their valuable time and money to learn something new and exciting at TechEd, and I had not yet attended a single session, even though it was already the last day of the event. I did have a plan for the day, and I attended two technical sessions before my own session on spatial databases. I attended two sessions by Vinod Kumar. Vinod is a natural storyteller and there was no doubt that his sessions would be jam-packed. People attended his sessions simply because Vinod is the speaker. He has not once disappointed an audience; he is truly a good speaker. He knows his stuff very well. I personally do not think that, in India, he can be compared to anyone for SQL.

Time: 12:30pm-1:30pm SQL Server Query Optimization, Execution and Debugging Query Performance

I really had a fun time attending this session. Vinod made this session very interactive. The entire audience really got into the presentation and started participating in the event.
Vinod presented a small Query Tuning problem, one that any developer would have encountered, and solved it with the audience's help in such a fashion that every developer felt he or she had already resolved it. On one question, I was the only one who was ready to answer, and Vinod told me in a light tone that I was not allowed to answer it! The audience really found it very amusing. There was a huge crowd around Vinod after the session.

Vinod – A master storyteller!

Time: 3:45pm-4:45pm Data Recovery / consistency with CheckDB

This session was much heavier than the earlier one, and I must say it is my favorite session that I have EVER attended in India. At this TechEd I only attended two sessions, but in my career I have attended numerous technical sessions, not only in India but all over the world. This session took my breath away. One by one, Vinod took different databases and started to corrupt them in different ways; each database was corrupted in its own unique way. Once that was done, Vinod started to show DBCC CHECKDB and demonstrated how it can solve your problem. He finally fixed all the databases with this single tool. I do have a good knowledge of this subject, but let me honestly admit that I learned a lot from this session. I enjoyed and cheered during this session along with the other attendees. I had the total satisfaction that, just like everyone else, I took advantage of the event and learned something. I am now TECHnically EDucated.

Pinal Dave and Vinod Kumar

After two very interactive and informative SQL sessions from Vinod Kumar, the next turn was mine, presenting on Spatial Database and Indexing. I once again got nervous, but Vinod told me to stay natural and do my presentation. Well, once I got a huge stage with a total of four projectors and a large crowd, I felt better.

Time: 5:00pm-6:00pm Session 3: Developing with SQL Server Spatial and Deep Dive into Spatial Indexing

Pinal Presenting session at TechEd India 2010

Pinal Presenting session at TechEd India 2010

I kicked off this session with Michael J Swart's beautiful spatial image. This session was the last one of the day but, to my surprise, I had more than 200 attendees. Slowly, the rain was starting outside and I was worried that the hall would not be full; despite this, there was not a single seat available within the first five minutes of the session. Thanks to all of you for attending my presentation. I demonstrated the map of the world (and India) and quickly explained what the Geography and Geometry data types in a Spatial Database are. This session had an interesting story about Indexing and Comparison, as well as how different traditional indexes are from spatial indexes.

Pinal Presenting session at TechEd India 2010

Due to the heavy rain during the event, the power went off for about 22 minutes (just an accident – nobody's fault). During these minutes, there was no audio, no video and no light. I continued to address the mass of 200+ people without any audio device or PowerPoint. I must thank the audience, because not a single person left the session. They all stayed in their places, and some moved closer to listen to me properly. I noticed that the curiosity and eagerness to learn new things was at its peak even though it was the very last session of TechEd. Everybody wanted to get the maximum knowledge out of this whole event. I was touched by the support from the audience. They listened and participated in my session even without any kind of technology (no ppt, no mike, no AC, nothing).
During these 22 minutes, I completed my theory verbally.

Pinal Presenting session at TechEd India 2010

After a while, we got the projector back online and we continued with some exciting demos. Many thanks to the Microsoft people who worked energetically in the background to get the backup power for the projector up. I had a very interesting demo wherein I overlaid Bangalore and Hyderabad on the map of India and found the aerial distance between them. After finding the aerial distance, we browsed online and found that SQL Server estimated the aerial distance between these two cities very accurately compared with the factual distance. There was huge applause from the crowd when they learned that SQL Server takes the curvature of the earth into account and finds precise distances. During the process of finding the distance, I demonstrated a few examples of the indexes involved, where I showed how one can use those indexes to find these distances and how they can improve the performance of similar queries. I also demonstrated a few examples wherein we were able to see for which data type the index is most useful. We finished the demos with a bit more internal material.

Pinal Presenting session at TechEd India 2010

Despite all the issues, I was mostly satisfied with my presentation. I think it was the best session I have ever presented at any conference. There was no help from technology for a while, but I still got lots of appreciation at the end. When we ended the session, the applause from the audience was so loud that, for a moment, the rain was not audible. I was truly moved by the dedication of the technology enthusiasts.

Pinal Dave After Presenting session at TechEd India 2010

The abstract of the session is as follows: Microsoft SQL Server 2008 delivers new spatial data types that enable you to consume, use, and extend location-based data through spatial-enabled applications. Attend this session to learn how to use the spatial functionality in the next version of SQL Server to build and optimize spatial queries. This session outlines how to use the new geography data type to store geodetic spatial data and perform operations on it, use the new geometry data type to store planar spatial data and perform operations on it, take advantage of new spatial indexes for high-performance queries, use the new spatial results tab to quickly and easily view spatial query results directly from within Management Studio, extend spatial data capabilities by building or integrating location-enabled applications through support for spatial standards and specifications, and much more.

Time: 8:00 PM – onwards Dinner by Sponsors

After the lively sessions during the day, there was another dinner party, courtesy of one of the sponsors of TechEd. All the MVPs and several Community leaders were present at the dinner. I would like to express my gratitude to Abhishek Kant for organizing this wonderful event for us. It was a blast and really relaxing from every angle. We all stayed there for a long time and talked about our sweet and unforgettable memories of the event.

Pinal Dave and Bijoy Singhal

It was really a wonderful event. Even after writing this much, I have no words to express how much I enjoyed TechEd. And it is true that I have shared with you only 1% of the total activities I took part in at the event. There were so many people I met who are not mentioned here, although I wanted to write their names, too.
Anyway, I learned so many things, and even now I am not able to get over all the fun I had at this event.

Pinal Dave at TechEd India 2010

The Next Days – April 15, 2010 – till today

I am still not able to get my mind off the whole experience I had at TechEd India 2010. It was like a whole Microsoft family working together to celebrate a happy occasion. TechEd India – Truly An Unforgettable Experience!

Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: About Me, MVP, Pinal Dave, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, SQLAuthority Author Visit, SQLAuthority News, SQLServer, T SQL, Technology Tagged: TechEd, TechEdIn

    Read the article

  • I am getting an error with a OneToMany association when using annotations with Gilead for Hibernate

    - by user286630
    Hello guys, I'm using Gilead to persist my entities in my GWT project, and I'm using Hibernate annotations as well. My problem is with my OneToMany association. This is my User class that holds a reference to a list of FileLocations:

    @Entity @Table(name = "yf_user_table") public class YFUser implements Serializable { @Id @GeneratedValue(strategy = GenerationType.AUTO) @Column(name = "user_id",nullable = false) private int userId; @Column(name = "username") private String username; @Column(name = "password") private String password; @Column(name = "email") private String email; @OneToMany(cascade = CascadeType.ALL, fetch = FetchType.LAZY) @JoinColumn(name="USER_ID") private List fileLocations = new ArrayList();

    This is my FileLocation class:

    @Entity @Table(name = "fileLocationTable") public class FileLocation implements Serializable { @Id @GeneratedValue(strategy = GenerationType.AUTO) @Column(name = "locationId", updatable = false, nullable = false) private int ieId; @Column (name = "location") private String location; @ManyToOne(fetch=FetchType.LAZY) @JoinColumn(name="USER_ID", nullable=false) private YFUser uploadedUser;

    When I persist this data in a normal desktop application, it works fine: it creates the tables and I can add and store data. But when I try to persist the data in my GWT application I get errors, which I will show below. This is my ServiceImpl class that extends PersistentRemoteService:

    public class TestServiceImpl extends PersistentRemoteService implements TestService { public static final String SESSION_USER = "UserWithinSession"; public TestServiceImpl(){ HibernateUtil util = new HibernateUtil(); util.setSessionFactory(com.example.server.HibernateUtil .getSessionFactory()); PersistentBeanManager pbm = new PersistentBeanManager(); pbm.setPersistenceUtil(util); pbm.setProxyStore(new StatelessProxyStore()); setBeanManager(pbm); } @Override public String registerUser(String username, String password, String email) { Session session = com.example.server.HibernateUtil.getSessionFactory().getCurrentSession(); session.beginTransaction(); YFUser newUser = new YFUser(); newUser.setUsername(username);newUser.setPassword(password);newUser.setEmail(email); session.save(newUser); session.getTransaction().commit(); return "Thank you For registering "+ username; }

    This is the error that I am getting. The error goes away and my session factory builds when I remove my OneToMany relationship; when I put it back in, the exception is thrown on the buildSessionFactory line in HibernateUtil. My HibernateUtil class is fine as well.
this is the error java.lang.NoSuchMethodError: javax.persistence.OneToMany.orphanRemoval()Z Mar 24, 2010 10:03:22 PM org.hibernate.cfg.annotations.Version <clinit> INFO: Hibernate Annotations 3.5.0-Beta-4 Mar 24, 2010 10:03:22 PM org.hibernate.cfg.Environment <clinit> INFO: Hibernate 3.5.0-Beta-4 Mar 24, 2010 10:03:22 PM org.hibernate.cfg.Environment <clinit> INFO: hibernate.properties not found Mar 24, 2010 10:03:22 PM org.hibernate.cfg.Environment buildBytecodeProvider INFO: Bytecode provider name : javassist Mar 24, 2010 10:03:22 PM org.hibernate.cfg.Environment <clinit> INFO: using JDK 1.4 java.sql.Timestamp handling Mar 24, 2010 10:03:22 PM org.hibernate.annotations.common.Version <clinit> INFO: Hibernate Commons Annotations 3.2.0-SNAPSHOT Mar 24, 2010 10:03:22 PM org.hibernate.cfg.Configuration configure INFO: configuring from resource: /hibernate.cfg.xml Mar 24, 2010 10:03:22 PM org.hibernate.cfg.Configuration getConfigurationInputStream INFO: Configuration resource: /hibernate.cfg.xml Mar 24, 2010 10:03:22 PM org.hibernate.cfg.Configuration doConfigure INFO: Configured SessionFactory: null Mar 24, 2010 10:03:22 PM org.hibernate.cfg.search.HibernateSearchEventListenerRegister enableHibernateSearch INFO: Unable to find org.hibernate.search.event.FullTextIndexEventListener on the classpath. Hibernate Search is not enabled. Mar 24, 2010 10:03:22 PM org.hibernate.cfg.AnnotationBinder bindClass INFO: Binding entity from annotated class: com.example.client.Entity1 Mar 24, 2010 10:03:22 PM org.hibernate.cfg.annotations.EntityBinder bindTable INFO: Bind entity com.example.client.Entity1 on table entityTable1 Mar 24, 2010 10:03:22 PM org.hibernate.cfg.AnnotationBinder bindClass INFO: Binding entity from annotated class: com.example.client.YFUser Mar 24, 2010 10:03:22 PM org.hibernate.cfg.annotations.EntityBinder bindTable INFO: Bind entity com.example.client.YFUser on table yf_user_table Initial SessionFactory creation failed.java.lang.NoSuchMethodError: javax.persistence.OneToMany.orphanRemoval()Z [WARN] Nested in javax.servlet.ServletException: init: java.lang.ExceptionInInitializerError at com.example.server.HibernateUtil.<clinit>(HibernateUtil.java:38) at com.example.server.TestServiceImpl.<init>(TestServiceImpl.java:29) at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39 ) at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27) at java.lang.reflect.Constructor.newInstance(Constructor.java:513) at java.lang.Class.newInstance0(Class.java:355) at java.lang.Class.newInstance(Class.java:308) at org.mortbay.jetty.servlet.Holder.newInstance(Holder.java:153) at org.mortbay.jetty.servlet.ServletHolder.getServlet(ServletHolder.java:339) at org.mortbay.jetty.servlet.ServletHolder.handle(ServletHolder.java:463) at org.mortbay.jetty.servlet.ServletHandler.handle(ServletHandler.java:362) at org.mortbay.jetty.security.SecurityHandler.handle(SecurityHandler.java:216) at org.mortbay.jetty.servlet.SessionHandler.handle(SessionHandler.java:181) at org.mortbay.jetty.handler.ContextHandler.handle(ContextHandler.java:729) at org.mortbay.jetty.webapp.WebAppContext.handle(WebAppContext.java:405) at org.mortbay.jetty.handler.HandlerWrapper.handle(HandlerWrapper.java:152) at org.mortbay.jetty.handler.RequestLogHandler.handle(RequestLogHandler.java:49) at org.mortbay.jetty.handler.HandlerWrapper.handle(HandlerWrapper.java:152) at 
org.mortbay.jetty.Server.handle(Server.java:324) at org.mortbay.jetty.HttpConnection.handleRequest(HttpConnection.java:505) at org.mortbay.jetty.HttpConnection$RequestHandler.content(HttpConnection.java:843) at org.mortbay.jetty.HttpParser.parseNext(HttpParser.java:647) at org.mortbay.jetty.HttpParser.parseAvailable(HttpParser.java:211) at org.mortbay.jetty.HttpConnection.handle(HttpConnection.java:380) at org.mortbay.io.nio.SelectChannelEndPoint.run(SelectChannelEndPoint.java:396) at org.mortbay.thread.QueuedThreadPool$PoolThread.run(QueuedThreadPool.java:488) Caused by: java.lang.NoSuchMethodError: javax.persistence.OneToMany.orphanRemoval()Z at org.hibernate.cfg.AnnotationBinder.processElementAnnotations(AnnotationBinder.java:1642) at org.hibernate.cfg.AnnotationBinder.bindClass(AnnotationBinder.java:772) at org.hibernate.cfg.AnnotationConfiguration.processArtifactsOfType(AnnotationConfiguration.java:629) at org.hibernate.cfg.AnnotationConfiguration.secondPassCompile(AnnotationConfiguration.java:350) at org.hibernate.cfg.Configuration.buildSessionFactory(Configuration.java:1373) at org.hibernate.cfg.AnnotationConfiguration.buildSessionFactory(AnnotationConfiguration.java:973) at com.example.server.HibernateUtil.<clinit>(HibernateUtil.java:33) ... 26 more [WARN] Nested in java.lang.ExceptionInInitializerError: java.lang.NoSuchMethodError: javax.persistence.OneToMany.orphanRemoval()Z at org.hibernate.cfg.AnnotationBinder.processElementAnnotations(AnnotationBinder.java:1642) at org.hibernate.cfg.AnnotationBinder.bindClass(AnnotationBinder.java:772) at org.hibernate.cfg.AnnotationConfiguration.processArtifactsOfType(AnnotationConfiguration.java:629) at org.hibernate.cfg.AnnotationConfiguration.secondPassCompile(AnnotationConfiguration.java:350) at org.hibernate.cfg.Configuration.buildSessionFactory(Configuration.java:1373) at org.hibernate.cfg.AnnotationConfiguration.buildSessionFactory(AnnotationConfiguration.java:973) at com.example.server.HibernateUtil.<clinit>(HibernateUtil.java:33) at com.example.server.TestServiceImpl.<init>(TestServiceImpl.java:29) at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39) at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27) at java.lang.reflect.Constructor.newInstance(Constructor.java:513) at java.lang.Class.newInstance0(Class.java:355) at java.lang.Class.newInstance(Class.java:308) at org.mortbay.jetty.servlet.Holder.newInstance(Holder.java:153) at org.mortbay.jetty.servlet.ServletHolder.getServlet(ServletHolder.java:339) at org.mortbay.jetty.servlet.ServletHolder.handle(ServletHolder.java:463) at org.mortbay.jetty.servlet.ServletHandler.handle(ServletHandler.java:362) at org.mortbay.jetty.security.SecurityHandler.handle(SecurityHandler.java:216) at org.mortbay.jetty.servlet.SessionHandler.handle(SessionHandler.java:181) at org.mortbay.jetty.handler.ContextHandler.handle(ContextHandler.java:729) at org.mortbay.jetty.webapp.WebAppContext.handle(WebAppContext.java:405) at org.mortbay.jetty.handler.HandlerWrapper.handle(HandlerWrapper.java:152) at org.mortbay.jetty.handler.RequestLogHandler.handle(RequestLogHandler.java:49) at org.mortbay.jetty.handler.HandlerWrapper.handle(HandlerWrapper.java:152) at org.mortbay.jetty.Server.handle(Server.java:324) at org.mortbay.jetty.HttpConnection.handleRequest(HttpConnection.java:505) at 
org.mortbay.jetty.HttpConnection$RequestHandler.content(HttpConnection.java:843) at org.mortbay.jetty.HttpParser.parseNext(HttpParser.java:647) at org.mortbay.jetty.HttpParser.parseAvailable(HttpParser.java:211) at org.mortbay.jetty.HttpConnection.handle(HttpConnection.java:380) at org.mortbay.io.nio.SelectChannelEndPoint.run(SelectChannelEndPoint.java:396) at org.mortbay.thread.QueuedThreadPool$PoolThread.run(QueuedThreadPool.java:488)

    Read the article

  • Many to Many Logic in ASP.NET MVC

    - by Jonathan Stowell
    Hi all, I will restrict this to the three tables I am trying to work with: Problem, Communications, and ProbComms. The scenario is that a Student may have many Problems concurrently which may affect their studies. Lecturers may have future communications with a student after an initial problem is logged; however, as a Student may have multiple Problems, the Lecturer may decide that the discussion they had relates to more than one Problem. Here is a screenshot of the LINQ representation of my DB: LINQ Screenshot

    At the moment in my StudentController I have a StudentFormViewModel class:

    // //ViewModel Class public class StudentFormViewModel { IProbCommRepository probCommRepository; // Properties public Student Student { get; private set; } public IEnumerable<ProbComm> ProbComm { get; private set; } // // Dependency Injection enabled constructors public StudentFormViewModel(Student student, IEnumerable<ProbComm> probComm) : this(new ProbCommRepository()) { this.Student = student; this.ProbComm = probComm; } public StudentFormViewModel(IProbCommRepository pRepository) { probCommRepository = pRepository; } }

    When I go to the Student's Details page, this runs:

    public ActionResult Details(string id) { StudentFormViewModel viewdata = new StudentFormViewModel(studentRepository.GetStudent(id), probCommRepository.FindAllProblemComms(id)); if (viewdata == null) return View("NotFound"); else return View(viewdata); }

    GetStudent works fine and returns an instance of the student to output on the page. Below the student I output all problems logged against them, but underneath these problems I want to show the communications related to each Problem. The LINQ I am using for ProbComms is located in the Model class ProbCommRepository and accessed via an IProbCommRepository interface:

    public IQueryable<ProbComm> FindAllProblemComms(string studentEmail) { return (from p in db.ProbComms where p.Problem.StudentEmail.Equals(studentEmail) orderby p.Problem.ProblemDateTime select p); }

    However, if for example I have this data in the ProbComms table:

    ProblemID CommunicationID
    1 1
    1 2

    the query returns two rows, so I assume I somehow have to group by Problem or ProblemID, but I am not too sure how to do this with the way I have built things, as the return type of the query has to be ProbComm because that is the Model class it is located in.
    When it comes to the view, Details.aspx calls two partial views, each passing the relevant view data through. The StudentDetails partial works fine:

    <%@ Page Title="" Language="C#" MasterPageFile="~/Views/Shared/Site.Master" Inherits="System.Web.Mvc.ViewPage<MitigatingCircumstances.Controllers.StudentFormViewModel>" %> <% Html.RenderPartial("StudentDetails", this.ViewData.Model.Student); %> <% Html.RenderPartial("StudentProblems", this.ViewData.Model.ProbComm); %>

    StudentProblems uses a foreach loop to loop through the records in the Model, and I am trying another foreach loop to output the communication details:

    <%@ Control Language="C#" Inherits="System.Web.Mvc.ViewUserControl<IEnumerable<MitigatingCircumstances.Models.ProbComm>>" %> <script type="text/javascript" language="javascript"> $(document).ready(function() { $("DIV.ContainerPanel > DIV.collapsePanelHeader > DIV.ArrowExpand").toggle( function() { $(this).parent().next("div.Content").show("slow"); $(this).attr("class", "ArrowClose"); }, function() { $(this).parent().next("div.Content").hide("slow"); $(this).attr("class", "ArrowExpand"); }); }); </script> <div class="studentProblems"> <% var i = 0; foreach (var item in Model) { %> <div id="ContainerPanel<%= i = i + 1 %>" class="ContainerPanel"> <div id="header<%= i = i + 1 %>" class="collapsePanelHeader"> <div id="dvHeaderText<%= i = i + 1 %>" class="HeaderContent"><%= Html.Encode(String.Format("{0:dd/MM/yyyy}", item.Problem.ProblemDateTime))%></div> <div id="dvArrow<%= i = i + 1 %>" class="ArrowExpand"></div> </div> <div id="dvContent<%= i = i + 1 %>" class="Content" style="display: none"> <p> Type: <%= Html.Encode(item.Problem.CommunicationType.TypeName) %> </p> <p> Problem Outline: <%= Html.Encode(item.Problem.ProblemOutline)%> </p> <p> Mitigating Circumstance Form: <%= Html.Encode(item.Problem.MCF)%> </p> <p> Mitigating Circumstance Level: <%= Html.Encode(item.Problem.MitigatingCircumstanceLevel.MCLevel)%> </p> <p> Absent From: <%= Html.Encode(String.Format("{0:g}", item.Problem.AbsentFrom))%> </p> <p> Absent Until: <%= Html.Encode(String.Format("{0:g}", item.Problem.AbsentUntil))%> </p> <p> Requested Follow Up: <%= Html.Encode(String.Format("{0:g}", item.Problem.RequestedFollowUp))%> </p> <p>Problem Communications</p> <% foreach (var comm in Model) { %> <p> <% if (item.Problem.ProblemID == comm.ProblemID) { %> <%= Html.Encode(comm.ProblemCommunication.CommunicationOutline)%> <% } %> </p> <% } %> </div> </div> <br /> <% } %> </div>

    The issue is that, using the example data above, the Model has two records for the same problem (as there are two communications for that problem), therefore duplicating the output. Any help with this would be gratefully appreciated. Thanks, Jon
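    A minimal sketch of one way to do the grouping being asked about: group the ProbComm link rows by their parent Problem so each Problem appears only once, each carrying its own communications. It assumes the same DataContext (db) and the entity names shown above; ProblemGroup and FindAllProblemGroups are hypothetical names introduced here for illustration, not part of the original model:

    using System.Collections.Generic;
    using System.Linq;

    // Hypothetical wrapper: one Problem plus all of its ProbComm link rows.
    public class ProblemGroup
    {
        public Problem Problem { get; set; }
        public List<ProbComm> Communications { get; set; }
    }

    // Sketch of a grouped query inside ProbCommRepository (assumed context and names).
    public IEnumerable<ProblemGroup> FindAllProblemGroups(string studentEmail)
    {
        return db.ProbComms
                 .Where(p => p.Problem.StudentEmail == studentEmail)
                 .OrderBy(p => p.Problem.ProblemDateTime)
                 .AsEnumerable()                // group in memory; LINQ-to-SQL identity tracking makes p.Problem a stable key
                 .GroupBy(p => p.Problem)
                 .Select(g => new ProblemGroup
                 {
                     Problem = g.Key,
                     Communications = g.ToList()
                 })
                 .ToList();
    }

    The view's outer foreach would then iterate one ProblemGroup per Problem, and the inner foreach would iterate group.Communications and read each ProblemCommunication.CommunicationOutline, which removes the duplicated rows without changing the ProbComm model class itself.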

    Read the article

  • Metro: Creating an IndexedDbDataSource for WinJS

    - by Stephen.Walther
    The goal of this blog entry is to describe how you can create custom data sources which you can use with the controls in the WinJS library. In particular, I explain how you can create an IndexedDbDataSource which you can use to store and retrieve data from an IndexedDB database. If you want to skip ahead, and ignore all of the fascinating content in-between, I’ve included the complete code for the IndexedDbDataSource at the very bottom of this blog entry. What is IndexedDB? IndexedDB is a database in the browser. You can use the IndexedDB API with all modern browsers including Firefox, Chrome, and Internet Explorer 10. And, of course, you can use IndexedDB with Metro style apps written with JavaScript. If you need to persist data in a Metro style app written with JavaScript then IndexedDB is a good option. Each Metro app can only interact with its own IndexedDB databases. And, IndexedDB provides you with transactions, indices, and cursors – the elements of any modern database. An IndexedDB database might be different than the type of database that you normally use. An IndexedDB database is an object-oriented database and not a relational database. Instead of storing data in tables, you store data in object stores. You store JavaScript objects in an IndexedDB object store. You create new IndexedDB object stores by handling the upgradeneeded event when you attempt to open a connection to an IndexedDB database. For example, here’s how you would both open a connection to an existing database named TasksDB and create the TasksDB database when it does not already exist: var reqOpen = window.indexedDB.open(“TasksDB”, 2); reqOpen.onupgradeneeded = function (evt) { var newDB = evt.target.result; newDB.createObjectStore("tasks", { keyPath: "id", autoIncrement: true }); }; reqOpen.onsuccess = function () { var db = reqOpen.result; // Do something with db }; When you call window.indexedDB.open(), and the database does not already exist, then the upgradeneeded event is raised. In the code above, the upgradeneeded handler creates a new object store named tasks. The new object store has an auto-increment column named id which acts as the primary key column. If the database already exists with the right version, and you call window.indexedDB.open(), then the success event is raised. At that point, you have an open connection to the existing database and you can start doing something with the database. You use asynchronous methods to interact with an IndexedDB database. For example, the following code illustrates how you would add a new object to the tasks object store: var transaction = db.transaction(“tasks”, “readwrite”); var reqAdd = transaction.objectStore(“tasks”).add({ name: “Feed the dog” }); reqAdd.onsuccess = function() { // Tasks added successfully }; The code above creates a new database transaction, adds a new task to the tasks object store, and handles the success event. If the new task gets added successfully then the success event is raised. Creating a WinJS IndexedDbDataSource The most powerful control in the WinJS library is the ListView control. This is the control that you use to display a collection of items. If you want to display data with a ListView control, you need to bind the control to a data source. The WinJS library includes two objects which you can use as a data source: the List object and the StorageDataSource object. 
The List object enables you to represent a JavaScript array as a data source and the StorageDataSource enables you to represent the file system as a data source. If you want to bind an IndexedDB database to a ListView then you have a choice. You can either dump the items from the IndexedDB database into a List object or you can create a custom data source. I explored the first approach in a previous blog entry. In this blog entry, I explain how you can create a custom IndexedDB data source. Implementing the IListDataSource Interface You create a custom data source by implementing the IListDataSource interface. This interface contains the contract for the methods which the ListView needs to interact with a data source. The easiest way to implement the IListDataSource interface is to derive a new object from the base VirtualizedDataSource object. The VirtualizedDataSource object requires a data adapter which implements the IListDataAdapter interface. Yes, because of the number of objects involved, this is a little confusing. Your code ends up looking something like this: var IndexedDbDataSource = WinJS.Class.derive( WinJS.UI.VirtualizedDataSource, function (dbName, dbVersion, objectStoreName, upgrade, error) { this._adapter = new IndexedDbDataAdapter(dbName, dbVersion, objectStoreName, upgrade, error); this._baseDataSourceConstructor(this._adapter); }, { nuke: function () { this._adapter.nuke(); }, remove: function (key) { this._adapter.removeInternal(key); } } ); The code above is used to create a new class named IndexedDbDataSource which derives from the base VirtualizedDataSource class. In the constructor for the new class, the base class _baseDataSourceConstructor() method is called. A data adapter is passed to the _baseDataSourceConstructor() method. The code above creates a new method exposed by the IndexedDbDataSource named nuke(). The nuke() method deletes all of the objects from an object store. The code above also overrides a method named remove(). Our derived remove() method accepts any type of key and removes the matching item from the object store. Almost all of the work of creating a custom data source goes into building the data adapter class. The data adapter class implements the IListDataAdapter interface which contains the following methods: · change() · getCount() · insertAfter() · insertAtEnd() · insertAtStart() · insertBefore() · itemsFromDescription() · itemsFromEnd() · itemsFromIndex() · itemsFromKey() · itemsFromStart() · itemSignature() · moveAfter() · moveBefore() · moveToEnd() · moveToStart() · remove() · setNotificationHandler() · compareByIdentity Fortunately, you are not required to implement all of these methods. You only need to implement the methods that you actually need. In the case of the IndexedDbDataSource, I implemented the getCount(), itemsFromIndex(), insertAtEnd(), and remove() methods. If you are creating a read-only data source then you really only need to implement the getCount() and itemsFromIndex() methods. Implementing the getCount() Method The getCount() method returns the total number of items from the data source. So, if you are storing 10,000 items in an object store then this method would return the value 10,000. 
Here’s how I implemented the getCount() method: getCount: function () { var that = this; return new WinJS.Promise(function (complete, error) { that._getObjectStore().then(function (store) { var reqCount = store.count(); reqCount.onerror = that._error; reqCount.onsuccess = function (evt) { complete(evt.target.result); }; }); }); } The first thing that you should notice is that the getCount() method returns a WinJS promise. This is a requirement. The getCount() method is asynchronous which is a good thing because all of the IndexedDB methods (at least the methods implemented in current browsers) are also asynchronous. The code above retrieves an object store and then uses the IndexedDB count() method to get a count of the items in the object store. The value is returned from the promise by calling complete(). Implementing the itemsFromIndex method When a ListView displays its items, it calls the itemsFromIndex() method. By default, it calls this method multiple times to get different ranges of items. Three parameters are passed to the itemsFromIndex() method: the requestIndex, countBefore, and countAfter parameters. The requestIndex indicates the index of the item from the database to show. The countBefore and countAfter parameters represent hints. These are integer values which represent the number of items before and after the requestIndex to retrieve. Again, these are only hints and you can return as many items before and after the request index as you please. Here’s how I implemented the itemsFromIndex method: itemsFromIndex: function (requestIndex, countBefore, countAfter) { var that = this; return new WinJS.Promise(function (complete, error) { that.getCount().then(function (count) { if (requestIndex >= count) { return WinJS.Promise.wrapError(new WinJS.ErrorFromName(WinJS.UI.FetchError.doesNotExist)); } var startIndex = Math.max(0, requestIndex - countBefore); var endIndex = Math.min(count, requestIndex + countAfter + 1); that._getObjectStore().then(function (store) { var index = 0; var items = []; var req = store.openCursor(); req.onerror = that._error; req.onsuccess = function (evt) { var cursor = evt.target.result; if (index < startIndex) { index = startIndex; cursor.advance(startIndex); return; } if (cursor && index < endIndex) { index++; items.push({ key: cursor.value[store.keyPath].toString(), data: cursor.value }); cursor.continue(); return; } results = { items: items, offset: requestIndex - startIndex, totalCount: count }; complete(results); }; }); }); }); } In the code above, a cursor is used to iterate through the objects in an object store. You fetch the next item in the cursor by calling either the cursor.continue() or cursor.advance() method. The continue() method moves forward by one object and the advance() method moves forward a specified number of objects. Each time you call continue() or advance(), the success event is raised again. If the cursor is null then you know that you have reached the end of the cursor and you can return the results. Some things to be careful about here. First, the return value from the itemsFromIndex() method must implement the IFetchResult interface. In particular, you must return an object which has an items, offset, and totalCount property. Second, each item in the items array must implement the IListItem interface. Each item should have a key and a data property. 
Implementing the insertAtEnd() Method When creating the IndexedDbDataSource, I wanted to go beyond creating a simple read-only data source and support inserting and deleting objects. If you want to support adding new items with your data source then you need to implement the insertAtEnd() method. Here’s how I implemented the insertAtEnd() method for the IndexedDbDataSource: insertAtEnd:function(unused, data) { var that = this; return new WinJS.Promise(function (complete, error) { that._getObjectStore("readwrite").done(function(store) { var reqAdd = store.add(data); reqAdd.onerror = that._error; reqAdd.onsuccess = function (evt) { var reqGet = store.get(evt.target.result); reqGet.onerror = that._error; reqGet.onsuccess = function (evt) { var newItem = { key:evt.target.result[store.keyPath].toString(), data:evt.target.result } complete(newItem); }; }; }); }); } When implementing the insertAtEnd() method, you need to be careful to return an object which implements the IItem interface. In particular, you should return an object that has a key and a data property. The key must be a string and it uniquely represents the new item added to the data source. The value of the data property represents the new item itself. Implementing the remove() Method Finally, you use the remove() method to remove an item from the data source. You call the remove() method with the key of the item which you want to remove. Implementing the remove() method in the case of the IndexedDbDataSource was a little tricky. The problem is that an IndexedDB object store uses an integer key and the VirtualizedDataSource requires a string key. For that reason, I needed to override the remove() method in the derived IndexedDbDataSource class like this: var IndexedDbDataSource = WinJS.Class.derive( WinJS.UI.VirtualizedDataSource, function (dbName, dbVersion, objectStoreName, upgrade, error) { this._adapter = new IndexedDbDataAdapter(dbName, dbVersion, objectStoreName, upgrade, error); this._baseDataSourceConstructor(this._adapter); }, { nuke: function () { this._adapter.nuke(); }, remove: function (key) { this._adapter.removeInternal(key); } } ); When you call remove(), you end up calling a method of the IndexedDbDataAdapter named removeInternal() . Here’s what the removeInternal() method looks like: setNotificationHandler: function (notificationHandler) { this._notificationHandler = notificationHandler; }, removeInternal: function(key) { var that = this; return new WinJS.Promise(function (complete, error) { that._getObjectStore("readwrite").done(function (store) { var reqDelete = store.delete (key); reqDelete.onerror = that._error; reqDelete.onsuccess = function (evt) { that._notificationHandler.removed(key.toString()); complete(); }; }); }); } The removeInternal() method calls the IndexedDB delete() method to delete an item from the object store. If the item is deleted successfully then the _notificationHandler.remove() method is called. Because we are not implementing the standard IListDataAdapter remove() method, we need to notify the data source (and the ListView control bound to the data source) that an item has been removed. The way that you notify the data source is by calling the _notificationHandler.remove() method. Notice that we get the _notificationHandler in the code above by implementing another method in the IListDataAdapter interface: the setNotificationHandler() method. 
You can raise the following types of notifications using the _notificationHandler: · beginNotifications() · changed() · endNotifications() · inserted() · invalidateAll() · moved() · removed() · reload() These methods are all part of the IListDataNotificationHandler interface in the WinJS library. Implementing the nuke() Method I wanted to implement a method which would remove all of the items from an object store. Therefore, I created a method named nuke() which calls the IndexedDB clear() method: nuke: function () { var that = this; return new WinJS.Promise(function (complete, error) { that._getObjectStore("readwrite").done(function (store) { var reqClear = store.clear(); reqClear.onerror = that._error; reqClear.onsuccess = function (evt) { that._notificationHandler.reload(); complete(); }; }); }); } Notice that the nuke() method calls the _notificationHandler.reload() method to notify the ListView to reload all of the items from its data source. Because we are implementing a custom method here, we need to use the _notificationHandler to send an update. Using the IndexedDbDataSource To illustrate how you can use the IndexedDbDataSource, I created a simple task list app. You can add new tasks, delete existing tasks, and nuke all of the tasks. You delete an item by selecting an item (swipe or right-click) and clicking the Delete button. Here’s the HTML page which contains the ListView, the form for adding new tasks, and the buttons for deleting and nuking tasks: <!DOCTYPE html> <html> <head> <meta charset="utf-8" /> <title>DataSources</title> <!-- WinJS references --> <link href="//Microsoft.WinJS.1.0.RC/css/ui-dark.css" rel="stylesheet" /> <script src="//Microsoft.WinJS.1.0.RC/js/base.js"></script> <script src="//Microsoft.WinJS.1.0.RC/js/ui.js"></script> <!-- DataSources references --> <link href="indexedDb.css" rel="stylesheet" /> <script type="text/javascript" src="indexedDbDataSource.js"></script> <script src="indexedDb.js"></script> </head> <body> <div id="tmplTask" data-win-control="WinJS.Binding.Template"> <div class="taskItem"> Id: <span data-win-bind="innerText:id"></span> <br /><br /> Name: <span data-win-bind="innerText:name"></span> </div> </div> <div id="lvTasks" data-win-control="WinJS.UI.ListView" data-win-options="{ itemTemplate: select('#tmplTask'), selectionMode: 'single' }"></div> <form id="frmAdd"> <fieldset> <legend>Add Task</legend> <label>New Task</label> <input id="inputTaskName" required /> <button>Add</button> </fieldset> </form> <button id="btnNuke">Nuke</button> <button id="btnDelete">Delete</button> </body> </html> And here is the JavaScript code for the TaskList app: /// <reference path="//Microsoft.WinJS.1.0.RC/js/base.js" /> /// <reference path="//Microsoft.WinJS.1.0.RC/js/ui.js" /> function init() { WinJS.UI.processAll().done(function () { var lvTasks = document.getElementById("lvTasks").winControl; // Bind the ListView to its data source var tasksDataSource = new DataSources.IndexedDbDataSource("TasksDB", 1, "tasks", upgrade); lvTasks.itemDataSource = tasksDataSource; // Wire-up Add, Delete, Nuke buttons document.getElementById("frmAdd").addEventListener("submit", function (evt) { evt.preventDefault(); tasksDataSource.beginEdits(); tasksDataSource.insertAtEnd(null, { name: document.getElementById("inputTaskName").value }).done(function (newItem) { tasksDataSource.endEdits(); document.getElementById("frmAdd").reset(); lvTasks.ensureVisible(newItem.index); }); }); document.getElementById("btnDelete").addEventListener("click", function () { if 
(lvTasks.selection.count() == 1) { lvTasks.selection.getItems().done(function (items) { tasksDataSource.remove(items[0].data.id); }); } }); document.getElementById("btnNuke").addEventListener("click", function () { tasksDataSource.nuke(); }); // This method is called to initialize the IndexedDb database function upgrade(evt) { var newDB = evt.target.result; newDB.createObjectStore("tasks", { keyPath: "id", autoIncrement: true }); } }); } document.addEventListener("DOMContentLoaded", init); The IndexedDbDataSource is created and bound to the ListView control with the following two lines of code: var tasksDataSource = new DataSources.IndexedDbDataSource("TasksDB", 1, "tasks", upgrade); lvTasks.itemDataSource = tasksDataSource; The IndexedDbDataSource is created with four parameters: the name of the database to create, the version of the database to create, the name of the object store to create, and a function which contains code to initialize the new database. The upgrade function creates a new object store named tasks with an auto-increment property named id: function upgrade(evt) { var newDB = evt.target.result; newDB.createObjectStore("tasks", { keyPath: "id", autoIncrement: true }); } The Complete Code for the IndexedDbDataSource Here’s the complete code for the IndexedDbDataSource: (function () { /************************************************ * The IndexedDBDataAdapter enables you to work * with a HTML5 IndexedDB database. *************************************************/ var IndexedDbDataAdapter = WinJS.Class.define( function (dbName, dbVersion, objectStoreName, upgrade, error) { this._dbName = dbName; // database name this._dbVersion = dbVersion; // database version this._objectStoreName = objectStoreName; // object store name this._upgrade = upgrade; // database upgrade script this._error = error || function (evt) { console.log(evt.message); }; }, { /******************************************* * IListDataAdapter Interface Methods ********************************************/ getCount: function () { var that = this; return new WinJS.Promise(function (complete, error) { that._getObjectStore().then(function (store) { var reqCount = store.count(); reqCount.onerror = that._error; reqCount.onsuccess = function (evt) { complete(evt.target.result); }; }); }); }, itemsFromIndex: function (requestIndex, countBefore, countAfter) { var that = this; return new WinJS.Promise(function (complete, error) { that.getCount().then(function (count) { if (requestIndex >= count) { return WinJS.Promise.wrapError(new WinJS.ErrorFromName(WinJS.UI.FetchError.doesNotExist)); } var startIndex = Math.max(0, requestIndex - countBefore); var endIndex = Math.min(count, requestIndex + countAfter + 1); that._getObjectStore().then(function (store) { var index = 0; var items = []; var req = store.openCursor(); req.onerror = that._error; req.onsuccess = function (evt) { var cursor = evt.target.result; if (index < startIndex) { index = startIndex; cursor.advance(startIndex); return; } if (cursor && index < endIndex) { index++; items.push({ key: cursor.value[store.keyPath].toString(), data: cursor.value }); cursor.continue(); return; } results = { items: items, offset: requestIndex - startIndex, totalCount: count }; complete(results); }; }); }); }); }, insertAtEnd:function(unused, data) { var that = this; return new WinJS.Promise(function (complete, error) { that._getObjectStore("readwrite").done(function(store) { var reqAdd = store.add(data); reqAdd.onerror = that._error; reqAdd.onsuccess = function (evt) { var reqGet 
= store.get(evt.target.result); reqGet.onerror = that._error; reqGet.onsuccess = function (evt) { var newItem = { key:evt.target.result[store.keyPath].toString(), data:evt.target.result } complete(newItem); }; }; }); }); }, setNotificationHandler: function (notificationHandler) { this._notificationHandler = notificationHandler; }, /***************************************** * IndexedDbDataSource Method ******************************************/ removeInternal: function(key) { var that = this; return new WinJS.Promise(function (complete, error) { that._getObjectStore("readwrite").done(function (store) { var reqDelete = store.delete (key); reqDelete.onerror = that._error; reqDelete.onsuccess = function (evt) { that._notificationHandler.removed(key.toString()); complete(); }; }); }); }, nuke: function () { var that = this; return new WinJS.Promise(function (complete, error) { that._getObjectStore("readwrite").done(function (store) { var reqClear = store.clear(); reqClear.onerror = that._error; reqClear.onsuccess = function (evt) { that._notificationHandler.reload(); complete(); }; }); }); }, /******************************************* * Private Methods ********************************************/ _ensureDbOpen: function () { var that = this; // Try to get cached Db if (that._cachedDb) { return WinJS.Promise.wrap(that._cachedDb); } // Otherwise, open the database return new WinJS.Promise(function (complete, error, progress) { var reqOpen = window.indexedDB.open(that._dbName, that._dbVersion); reqOpen.onerror = function (evt) { error(); }; reqOpen.onupgradeneeded = function (evt) { that._upgrade(evt); that._notificationHandler.invalidateAll(); }; reqOpen.onsuccess = function () { that._cachedDb = reqOpen.result; complete(that._cachedDb); }; }); }, _getObjectStore: function (type) { type = type || "readonly"; var that = this; return new WinJS.Promise(function (complete, error) { that._ensureDbOpen().then(function (db) { var transaction = db.transaction(that._objectStoreName, type); complete(transaction.objectStore(that._objectStoreName)); }); }); }, _get: function (key) { return new WinJS.Promise(function (complete, error) { that._getObjectStore().done(function (store) { var reqGet = store.get(key); reqGet.onerror = that._error; reqGet.onsuccess = function (item) { complete(item); }; }); }); } } ); var IndexedDbDataSource = WinJS.Class.derive( WinJS.UI.VirtualizedDataSource, function (dbName, dbVersion, objectStoreName, upgrade, error) { this._adapter = new IndexedDbDataAdapter(dbName, dbVersion, objectStoreName, upgrade, error); this._baseDataSourceConstructor(this._adapter); }, { nuke: function () { this._adapter.nuke(); }, remove: function (key) { this._adapter.removeInternal(key); } } ); WinJS.Namespace.define("DataSources", { IndexedDbDataSource: IndexedDbDataSource }); })(); Summary In this blog post, I provided an overview of how you can create a new data source which you can use with the WinJS library. I described how you can create an IndexedDbDataSource which you can use to bind a ListView control to an IndexedDB database. While describing how you can create a custom data source, I explained how you can implement the IListDataAdapter interface. You also learned how to raise notifications — such as a removed or invalidateAll notification — by taking advantage of the methods of the IListDataNotificationHandler interface.

    Read the article

  • Many to Many with LINQ-To-Sql and ASP.NET MVC

    - by Jonathan Stowell
    Hi all, I will restrict this to the three tables I am trying to work with: Problem, Communications, and ProbComms. The scenario is that a Student may have many Problems concurrently which may affect their studies. Lecturers may have future communications with a student after an initial problem is logged; however, as a Student may have multiple Problems, the Lecturer may decide that the discussion they had relates to more than one Problem. Here is a screenshot of the LINQ-To-Sql representation of my DB: LINQ-To-Sql Screenshot

    At the moment in my StudentController I have a StudentFormViewModel class:

    // //ViewModel Class public class StudentFormViewModel { IProbCommRepository probCommRepository; // Properties public Student Student { get; private set; } public IEnumerable<ProbComm> ProbComm { get; private set; } // // Dependency Injection enabled constructors public StudentFormViewModel(Student student, IEnumerable<ProbComm> probComm) : this(new ProbCommRepository()) { this.Student = student; this.ProbComm = probComm; } public StudentFormViewModel(IProbCommRepository pRepository) { probCommRepository = pRepository; } }

    When I go to the Student's Details page, this runs:

    public ActionResult Details(string id) { StudentFormViewModel viewdata = new StudentFormViewModel(studentRepository.GetStudent(id), probCommRepository.FindAllProblemComms(id)); if (viewdata == null) return View("NotFound"); else return View(viewdata); }

    GetStudent works fine and returns an instance of the student to output on the page. Below the student I output all problems logged against them, but underneath these problems I want to show the communications related to each Problem. The LINQ I am using for ProbComms is located in the Model class ProbCommRepository and accessed via an IProbCommRepository interface:

    public IQueryable<ProbComm> FindAllProblemComms(string studentEmail) { return (from p in db.ProbComms where p.Problem.StudentEmail.Equals(studentEmail) orderby p.Problem.ProblemDateTime select p); }

    However, if for example I have this data in the ProbComms table:

    ProblemID CommunicationID
    1 1
    1 2

    the query returns two rows, so I assume I somehow have to group by Problem or ProblemID, but I am not too sure how to do this with the way I have built things, as the return type of the query has to be ProbComm because that is the Model class it is located in.
    When it comes to the view, Details.aspx calls two partial views, each passing the relevant view data through. The StudentDetails partial works fine:

    <%@ Page Title="" Language="C#" MasterPageFile="~/Views/Shared/Site.Master" Inherits="System.Web.Mvc.ViewPage<MitigatingCircumstances.Controllers.StudentFormViewModel>" %> <% Html.RenderPartial("StudentDetails", this.ViewData.Model.Student); %> <% Html.RenderPartial("StudentProblems", this.ViewData.Model.ProbComm); %>

    StudentProblems uses a foreach loop to loop through the records in the Model, and I am trying another foreach loop to output the communication details:

    <%@ Control Language="C#" Inherits="System.Web.Mvc.ViewUserControl<IEnumerable<MitigatingCircumstances.Models.ProbComm>>" %> <script type="text/javascript" language="javascript"> $(document).ready(function() { $("DIV.ContainerPanel > DIV.collapsePanelHeader > DIV.ArrowExpand").toggle( function() { $(this).parent().next("div.Content").show("slow"); $(this).attr("class", "ArrowClose"); }, function() { $(this).parent().next("div.Content").hide("slow"); $(this).attr("class", "ArrowExpand"); }); }); </script> <div class="studentProblems"> <% var i = 0; foreach (var item in Model) { %> <div id="ContainerPanel<%= i = i + 1 %>" class="ContainerPanel"> <div id="header<%= i = i + 1 %>" class="collapsePanelHeader"> <div id="dvHeaderText<%= i = i + 1 %>" class="HeaderContent"><%= Html.Encode(String.Format("{0:dd/MM/yyyy}", item.Problem.ProblemDateTime))%></div> <div id="dvArrow<%= i = i + 1 %>" class="ArrowExpand"></div> </div> <div id="dvContent<%= i = i + 1 %>" class="Content" style="display: none"> <p> Type: <%= Html.Encode(item.Problem.CommunicationType.TypeName) %> </p> <p> Problem Outline: <%= Html.Encode(item.Problem.ProblemOutline)%> </p> <p> Mitigating Circumstance Form: <%= Html.Encode(item.Problem.MCF)%> </p> <p> Mitigating Circumstance Level: <%= Html.Encode(item.Problem.MitigatingCircumstanceLevel.MCLevel)%> </p> <p> Absent From: <%= Html.Encode(String.Format("{0:g}", item.Problem.AbsentFrom))%> </p> <p> Absent Until: <%= Html.Encode(String.Format("{0:g}", item.Problem.AbsentUntil))%> </p> <p> Requested Follow Up: <%= Html.Encode(String.Format("{0:g}", item.Problem.RequestedFollowUp))%> </p> <p>Problem Communications</p> <% foreach (var comm in Model) { %> <p> <% if (item.Problem.ProblemID == comm.ProblemID) { %> <%= Html.Encode(comm.ProblemCommunication.CommunicationOutline)%> <% } %> </p> <% } %> </div> </div> <br /> <% } %> </div>

    The issue is that, using the example data above, the Model has two records for the same problem (as there are two communications for that problem), therefore duplicating the output. Any help with this would be gratefully appreciated. Thanks, Jon

    Read the article

  • Block Hover Effect - Why doesn't it work correctly in FF3.6?

    - by Brian Ojeda
    Why doesn't following code work correctly in FireFox 3.6? I have tested in IE7, IE8, and Chrome with out any issues. Issue: The first block hover link (the table's 3rd row) doesn't apply the same style/effect as the following below it. Notes: I am trying to create my own table framework. This project is something I am doing to learn more about CSS. Before I started, I thought I knew a lot about CSS. However, to my surprise I was wrong. Who knew? Moving on... As side note, I do not want to take the time to support IE6. So, if you see a problem related IE6, please don't waste your time telling. One another side note, the following style script and HTML listed when this question is strip-down/bare-bone of the complete CSS/HTML. It should be enough to assist me. CSS: /* Main Properties */ .ojtable{display:block;clear:both; margin-left:auto;margin-right:auto; margin-top:0px; width:650px;} .ojtable-row, .ojtable-head {display:block;clear:both;position:relative; margin-left:0px;margin-right:0px;padding:0px;} .col-1, .col-2, .col-3, .col-4, .col-5, .col-6, .col-7, .col-8, .col-9, .col-10, .col-11, .col-12, .col-13, .col-1-b1, .col-2-b1, .col-3-b1, .col-4-b1, .col-5-b1, .col-6-b1, .col-7-b1, .col-8-b1, .col-9-b1, .col-10-b1, .col-11-b1, .col-12-b1, .col-13-b1 {display:block;float:left;position:relative; margin-left:0px;margin-right:0px;padding:0px 2px;} /* Border */ .border-b1{border:solid #000000; border-width:0 0 1px 0;} .border-ltr{border:solid #000000; border-width:1px 1px 0 1px;} /* Header */ .ojtable-row{width:100%;} .ojtable-head{width:100%;} /* No Border*/ .col-2{width:96px;} /* Border: 1px */ .col-2-b1{width:95px;} .col-7-b1{width:345px;} /*--- Clear Floated Elements ---*/ /* Credit: http://sonspring.com/journal/clearing-floats */ .clear { clear: both; display: block; overflow: hidden; visibility: hidden; width: 0; height: 0; } /* Credit: http://perishablepress.com/press/2008/02/05/ lessons-learned-concerning-the-clearfix-css-hack/ */ .clearfix:after { visibility: hidden; display: block; font-size: 0; content: " "; clear: both; height: 0; } .clearfix { display:inline-block; } /* start commented backslash hack \*/ * html .clearfix { height: 1%; } .clearfix { display: block; } /* close commented backslash hack */ /*--- Hover Effect for the Tables ---*/ a {text-decoration:none;} * .ojtable a .ojtable-row{width:650px; display:block; text-decoration:none;} * html .ojtable a .ojtable-row {width:650px;}/* Hover Fix for IE */ .ojtable a:hover .ojtable-row{background:#AAAAAA; cursor:pointer;} HTML: <div class="ojtable border-ltr clearfix"> <div class="ojtable-row border-b1 clearfix"> <div class="col-13">Newest Blogs</div> </div> <div class="ojtable-row border-b1 clearfix"> <div class="col-7-b1 border-r1">Name</div> <div class="col-4-b1 border-r1">Creater's Name</div> <div class="col-2">Dated Created</div> </div> <a href="#"><div class="ojtable-row border-b1 clearfix"> <div class="col-7-b1 border-r1">Why jQuery?</div> <div class="col-4-b1 border-r1">Gramcracker</div> <div class="col-2">Mar 11 2010</div> </div></a> <a href="#"><div class="ojtable-row border-b1 clearfix"> <div class="col-7-b1 border-r1">Thank You For Your Help</div> <div class="col-4-b1 border-r1">O'Hater</div> <div class="col-2">Nov 2 2009</div> </div></a> <a href="#"><div class="ojtable-row border-b1 clearfix"> <div class="col-7-b1 border-r1">Click Me! 
Hahaha!</div> <div class="col-4-b1 border-r1">Brian Ojeda</div> <div class="col-2">Nov 29 2008</div> </div></a> <a href="#"><div class="ojtable-row border-b1 clearfix"> <div class="col-7-b1 border-r1">Moment of Zen</div> <div class="col-4-b1 border-r1">Jedi</div> <div class="col-2">Mar 11 2010</div> </div></a> <a href="#"><div class="ojtable-row border-b1 clearfix"> <div class="col-7-b1 border-r1"></div> <div class="col-4-b1 border-r1">SGT OJ</div> <div class="col-2">Mar 11 2010</div> </div></a> </div> <!-- End of Table --> PS: Thank you for your assistance, if you do choose to help.

    Read the article

  • Error: No mapping exists from object type....

    - by jakesankey
    Here is the code for my simple parsing application. I am getting an error that states 'No mapping exists from type System.Text.RegularExpressions.Match to a known managed provider native type'. This started to occur when I switched from using Split('_') to RegEx.Match for defining RNumberE, RNumberD, etc. Any guidance is appreciated. using System; using System.Data; using System.Data.SQLite; using System.IO; using System.Text.RegularExpressions; using System.Threading; using System.Collections.Generic; using System.Linq; using System.Data.SqlClient; namespace JohnDeereCMMDataParser { internal class Program { public static List<string> GetImportedFileList() { List<string> ImportedFiles = new List<string>(); using (SqlConnection connect = new SqlConnection(@"Server=FRXSQLDEV;Database=RX_CMMData;Integrated Security=YES")) { connect.Open(); using (SqlCommand fmd = connect.CreateCommand()) { fmd.CommandText = @"SELECT FileName FROM CMMData;"; fmd.CommandType = CommandType.Text; SqlDataReader r = fmd.ExecuteReader(); while (r.Read()) { ImportedFiles.Add(Convert.ToString(r["FileName"])); } } } return ImportedFiles; } private static void Main(string[] args) { Console.Title = "John Deere CMM Data Parser"; Console.WriteLine("Preparing CMM Data Parser... done"); Console.WriteLine("Scanning for new CMM data..."); Console.ForegroundColor = ConsoleColor.Gray; using (SqlConnection con = new SqlConnection(@"Server=FRXSQLDEV;Database=RX_CMMData;Integrated Security=YES")) { con.Open(); using (SqlCommand insertCommand = con.CreateCommand()) { Console.WriteLine("Connecting to SQL server..."); SqlCommand cmdd = con.CreateCommand(); string[] files = Directory.GetFiles(@"C:\Documents and Settings\js91162\Desktop\CMM WENZEL\", "*_*_*.txt", SearchOption.AllDirectories); List<string> ImportedFiles = GetImportedFileList(); insertCommand.Parameters.Add(new SqlParameter("@FeatType", DbType.String)); insertCommand.Parameters.Add(new SqlParameter("@FeatName", DbType.String)); insertCommand.Parameters.Add(new SqlParameter("@Axis", DbType.String)); insertCommand.Parameters.Add(new SqlParameter("@Actual", DbType.Decimal)); insertCommand.Parameters.Add(new SqlParameter("@Nominal", DbType.Decimal)); insertCommand.Parameters.Add(new SqlParameter("@Dev", DbType.Decimal)); insertCommand.Parameters.Add(new SqlParameter("@TolMin", DbType.Decimal)); insertCommand.Parameters.Add(new SqlParameter("@TolPlus", DbType.Decimal)); insertCommand.Parameters.Add(new SqlParameter("@OutOfTol", DbType.Decimal)); foreach (string file in files.Except(ImportedFiles)) { var FileNameExt1 = Path.GetFileName(file); cmdd.Parameters.Clear(); cmdd.Parameters.Add(new SqlParameter("@FileExt", FileNameExt1)); cmdd.CommandText = @" IF (EXISTS (SELECT * FROM INFORMATION_SCHEMA.TABLES WHERE TABLE_SCHEMA = 'RX_CMMData' AND TABLE_NAME = 'CMMData')) BEGIN SELECT COUNT(*) FROM CMMData WHERE FileName = @FileExt; END"; int count = Convert.ToInt32(cmdd.ExecuteScalar()); con.Close(); con.Open(); if (count == 0) { Console.WriteLine("Preparing to parse CMM data for SQL import..."); if (file.Count(c => c == '_') > 5) continue; insertCommand.CommandText = @" INSERT INTO CMMData (FeatType, FeatName, Axis, Actual, Nominal, Dev, TolMin, TolPlus, OutOfTol, PartNumber, CMMNumber, Date, FileName) VALUES (@FeatType, @FeatName, @Axis, @Actual, @Nominal, @Dev, @TolMin, @TolPlus, @OutOfTol, @PartNumber, @CMMNumber, @Date, @FileName);"; string FileNameExt = Path.GetFullPath(file); string RNumber = Path.GetFileNameWithoutExtension(file); int index2 = RNumber.IndexOf("~"); Match 
RNumberE = Regex.Match(RNumber, @"^(R|L)\d{6}(COMP|CRIT|TEST|SU[1-9])(?=_)", RegexOptions.IgnoreCase); Match RNumberD = Regex.Match(RNumber, @"(?<=_)\d{3}[A-Z]\d{4}|\d{3}[A-Z]\d\w\w\d(?=_)", RegexOptions.IgnoreCase); Match RNumberDate = Regex.Match(RNumber, @"(?<=_)\d{8}(?=_)", RegexOptions.IgnoreCase); if (RNumberD.Value == @"") continue; if (RNumberE.Value == @"") continue; if (RNumberDate.Value == @"") continue; if (index2 != -1) continue; /* string RNumberE = RNumber.Split('_')[0]; string RNumberD = RNumber.Split('_')[1]; string RNumberDate = RNumber.Split('_')[2]; */ DateTime dateTime = DateTime.ParseExact(RNumberDate.Value, "yyyyMMdd", Thread.CurrentThread.CurrentCulture); string cmmDate = dateTime.ToString("dd-MMM-yyyy"); string[] lines = File.ReadAllLines(file); bool parse = false; foreach (string tmpLine in lines) { string line = tmpLine.Trim(); if (!parse && line.StartsWith("Feat. Type,")) { parse = true; continue; } if (!parse || string.IsNullOrEmpty(line)) { continue; } Console.WriteLine(tmpLine); foreach (SqlParameter parameter in insertCommand.Parameters) { parameter.Value = null; } string[] values = line.Split(new[] { ',' }); for (int i = 0; i < values.Length - 1; i++) { SqlParameter param = insertCommand.Parameters[i]; if (param.DbType == DbType.Decimal) { decimal value; param.Value = decimal.TryParse(values[i], out value) ? value : 0; } else { param.Value = values[i]; } } insertCommand.Parameters.Add(new SqlParameter("@PartNumber", RNumberE)); insertCommand.Parameters.Add(new SqlParameter("@CMMNumber", RNumberD)); insertCommand.Parameters.Add(new SqlParameter("@Date", cmmDate)); insertCommand.Parameters.Add(new SqlParameter("@FileName", FileNameExt)); insertCommand.ExecuteNonQuery(); insertCommand.Parameters.RemoveAt("@PartNumber"); insertCommand.Parameters.RemoveAt("@CMMNumber"); insertCommand.Parameters.RemoveAt("@Date"); insertCommand.Parameters.RemoveAt("@FileName"); } } } Console.WriteLine("CMM data successfully imported to SQL database..."); } con.Close(); } } } }
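
The exception text points at the parameter values rather than the SQL itself: RNumberE, RNumberD, etc. are now System.Text.RegularExpressions.Match objects, and SqlClient has no mapping for that type, whereas the earlier Split('_') code handed it plain strings. A minimal sketch of the likely fix, assuming the target columns are character data as the rest of the code implies, is to pass the matched text (.Value) instead of the Match object:

    // Pass the matched text, not the Match object, so SqlClient can map it to a string.
    insertCommand.Parameters.AddWithValue("@PartNumber", RNumberE.Value);
    insertCommand.Parameters.AddWithValue("@CMMNumber", RNumberD.Value);
    insertCommand.Parameters.AddWithValue("@Date", cmmDate);
    insertCommand.Parameters.AddWithValue("@FileName", FileNameExt);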

    Read the article

  • HELP!! Delayed-job: Rake aborted! Can't modify frozen hash

    - by pmneve
    Too bad even the trace doesn't say which hash is involved. Sorry this post is long: am trying to provide enough context to be meaningful. Occurs intermittently when rake jobs:work is pulling a command out of delayed_jobs while my status observer is in the process of parsing a log file for detailed results of the previous delayed_job denizen. I have an observer class (in RAILS_ROOT/lib ) which listens for the events, makes a copy of them and calls the owner class ( in apps/models ) which then calls on the log parser (also in /lib) to do the actual work. (Should both of those classes, the observer and the parser be in app/models?) Am due to deliver this application in a few days and this is killing it (and me). Am using DirectoryWatcher to look for flag files that indicate the start and finish of the delayed_jobs. That is started at the end of environment.rb like this: require 'directory_watcher' $scriptStatusObserver = ScriptStatusObserver.new dirToWatch ="#{RAILS_ROOT}/tmp/flags" $directoryWatcher = DirectoryWatcher.new( dirToWatch ) $directoryWatcher.glob= "*.flg" $directoryWatcher.interval=(15) $directoryWatcher.add_observer( $scriptStatusObserver ) $directoryWatcher.persist=("#{RAILS_ROOT}/tmp/flags/dw_state.yml") $directoryWatcher.start at_exit { $directoryWatcher.stop } This code is outside of the run method (btw is that the best place or is inside the run better?) Here is the observer: require 'script_run' class ScriptStatusObserver def initialize @rcvdEvents = [] end def update( *events ) begin puts "#{LINE.to_s}: ScriptStatusObserver events: \n"+events.to_yaml cnt = 0 events.each do |e| if e.to_s.match(/^\s*added/) cnt = cnt + 1 @rcvdEvents << e end end ScriptRun.new.catch_up( @rcvdEvents ) if cnt > 0 @rcvdEvents.clear rescue puts $! end end end Here is ScriptRun (it attaches to an associative table built with has_many:through) require 'observer' class ScriptRun < ActiveRecord::Base set_table_name "scripts_runs" belongs_to :script belongs_to :run def parse( result ) parser = LogParser.new parser.parse(result) end def catch_up( events ) events.each do |e| typ = e.type path = e.path thisMatch = path.match(/flags\/(\d+)_(\d+)_([\d\.]+)_(\w+)\.flg/) run_id = thisMatch[1] script_id = thisMatch[2] ts = thisMatch[3] status = thisMatch[4] if e.to_s.match(/^\s*added/) status_update( script_id, run_id, status, ts, path ) end end end def status_update( script_id, run_id, status, ts, path ) scriptrun = ScriptRun.find(:first, :conditions => [ "run_id = ? 
and script_id = ?", run_id.to_i, script_id.to_i ]) if scriptrun.kind_of?(ScriptRun) currStatus = scriptrun.status if not currStatus == 'completed' scriptrun.update_attribute(:status, status) if status == 'parse' flag = File.new(path) logSpec = flag.gets flag.close logName = File.basename(logSpec) logPath = logSpec.sub(logName, '') logName =~ /^(([\w_]+)_([\w]+)_(\d+))\.log$/ name = $1 basename = $2 runenv = $3 tsOrPid = $4 result = Result.new result.log_path = logPath result.basename = basename result.name = name result.script_id = script_id.to_i result.run_id = run_id.to_i if runenv == 'sit' runenv = 'SIT3348' end result.application_environment_id = ApplicationEnvironment.find(:first, :conditions => [ "nodename = ?", runenv]).id parse(result) if run_completed?( run_id ) myRun = Run.find(run_id.to_i) if myRun.kind_of?( Run ) myRun.update_attribute( :completed, Time.now.to_f ) end end end end else puts "#{__LINE__.to_s}: ScriptRun.status_update: ScriptRun not found for run #{run_id} script #{script_id} ts #{ts.to_s}" end File.delete(path) end def run_completed?( id ) scriptruns = ScriptRun.find(:all, :conditions = [ "run_id = ?", id.to_i] ) scriptruns.each do |sr| if not sr.status == 'completed' return false end end return true end end LogParser is too long even for this post but it reads the script log and pulls detailed information (counts and timings) out of the log and writes to a details table. It also tallies and calculates averages and rolls those up into summary tables for quicker access from the web pages. Here is the error trace: (don't ask why everything is under my Windows profile. It's a long story) Scanner running 1270239731.43 directory_watcher.notify_observers: #, #] update:[#, /pneve/workspace/waftt-0.29/tmp/flags/100039_18_1270239550.108_parse.flg"] rake aborted! 
can't modify frozen hash C:/Documents and Settings/pneve/workspace/waftt-0.29/vendor/rails/activerecord/l ib/active_record/attribute_methods.rb:313:in []=' C:/Documents and Settings/pneve/workspace/waftt-0.29/vendor/rails/activerecord/l ib/active_record/attribute_methods.rb:313:inwrite_attribute_without_dirty' C:/Documents and Settings/pneve/workspace/waftt-0.29/vendor/rails/activerecord/l ib/active_record/dirty.rb:139:in write_attribute' C:/Documents and Settings/pneve/workspace/waftt-0.29/vendor/rails/activerecord/l ib/active_record/attribute_methods.rb:211:inlast_error=' C:/Documents and Settings/pneve/workspace/waftt-0.29/vendor/plugins/delayed_job/ lib/delayed/worker.rb:141:in handle_failed_job' C:/Documents and Settings/pneve/workspace/waftt-0.29/vendor/plugins/delayed_job/ lib/delayed/worker.rb:115:inrun' C:/Documents and Settings/pneve/workspace/waftt-0.29/vendor/plugins/delayed_job/ lib/delayed/worker.rb:162:in reserve_and_run_one_job' C:/Documents and Settings/pneve/workspace/waftt-0.29/vendor/plugins/delayed_job/ lib/delayed/worker.rb:92:inwork_off' C:/Documents and Settings/pneve/workspace/waftt-0.29/vendor/plugins/delayed_job/ lib/delayed/worker.rb:91:in times' C:/Documents and Settings/pneve/workspace/waftt-0.29/vendor/plugins/delayed_job/ lib/delayed/worker.rb:91:inwork_off' C:/Documents and Settings/pneve/workspace/waftt-0.29/vendor/plugins/delayed_job/ lib/delayed/worker.rb:66:in start' C:/Documents and Settings/pneve/workspace/waftt-0.29/vendor/rails/activesupport/ lib/active_support/core_ext/benchmark.rb:10:inrealtime' C:/Documents and Settings/pneve/workspace/waftt-0.29/vendor/plugins/delayed_job/ lib/delayed/worker.rb:65:in start' C:/Documents and Settings/pneve/workspace/waftt-0.29/vendor/plugins/delayed_job/ lib/delayed/worker.rb:62:inloop' C:/Documents and Settings/pneve/workspace/waftt-0.29/vendor/plugins/delayed_job/ lib/delayed/worker.rb:62:in start' C:/Documents and Settings/pneve/workspace/waftt-0.29/vendor/plugins/delayed_job/ lib/delayed/tasks.rb:13 c:/Documents and Settings/pneve/ruby/lib/ruby/gems/1.8/gems/rake-0.8.7/lib/rake. rb:636:incall' c:/Documents and Settings/pneve/ruby/lib/ruby/gems/1.8/gems/rake-0.8.7/lib/rake. rb:636:in execute' c:/Documents and Settings/pneve/ruby/lib/ruby/gems/1.8/gems/rake-0.8.7/lib/rake. rb:631:ineach' c:/Documents and Settings/pneve/ruby/lib/ruby/gems/1.8/gems/rake-0.8.7/lib/rake. rb:631:in execute' c:/Documents and Settings/pneve/ruby/lib/ruby/gems/1.8/gems/rake-0.8.7/lib/rake. rb:597:ininvoke_with_call_chain' c:/Documents and Settings/pneve/ruby/lib/ruby/1.8/monitor.rb:242:in synchronize ' c:/Documents and Settings/pneve/ruby/lib/ruby/gems/1.8/gems/rake-0.8.7/lib/rake. rb:590:ininvoke_with_call_chain' c:/Documents and Settings/pneve/ruby/lib/ruby/gems/1.8/gems/rake-0.8.7/lib/rake. rb:583:in invoke' c:/Documents and Settings/pneve/ruby/lib/ruby/gems/1.8/gems/rake-0.8.7/lib/rake. rb:2051:ininvoke_task' c:/Documents and Settings/pneve/ruby/lib/ruby/gems/1.8/gems/rake-0.8.7/lib/rake. rb:2029:in top_level' c:/Documents and Settings/pneve/ruby/lib/ruby/gems/1.8/gems/rake-0.8.7/lib/rake. rb:2029:ineach' c:/Documents and Settings/pneve/ruby/lib/ruby/gems/1.8/gems/rake-0.8.7/lib/rake. rb:2029:in top_level' c:/Documents and Settings/pneve/ruby/lib/ruby/gems/1.8/gems/rake-0.8.7/lib/rake. rb:2068:instandard_exception_handling' c:/Documents and Settings/pneve/ruby/lib/ruby/gems/1.8/gems/rake-0.8.7/lib/rake. rb:2023:in top_level' c:/Documents and Settings/pneve/ruby/lib/ruby/gems/1.8/gems/rake-0.8.7/lib/rake. 
rb:2001:inrun' c:/Documents and Settings/pneve/ruby/lib/ruby/gems/1.8/gems/rake-0.8.7/lib/rake. rb:2068:in standard_exception_handling' c:/Documents and Settings/pneve/ruby/lib/ruby/gems/1.8/gems/rake-0.8.7/lib/rake. rb:1998:inrun' c:/Documents and Settings/pneve/ruby/lib/ruby/gems/1.8/gems/rake-0.8.7/bin/rake: 31 c:/Documents and Settings/pneve/ruby/bin/rake:16:in `load' c:/Documents and Settings/pneve/ruby/bin/rake:16

    Read the article

  • Problem with updating data in asp .NET MVC 2 application

    - by Bojan
    Hello everyone, i am just getting started with asp .NET MVC 2 applications and i stumbled upon a problem. I'm having trouble updating my tables. The debugger doesn't report any error, it just doesn't do anything... I hope some can help me out. Thank you for your time. This is my controller code... public ActionResult Edit(int id) { var supplierToEdit = (from c in _entities.SupplierSet where c.SupplierId == id select c).FirstOrDefault(); return View(supplierToEdit); } // // POST: /Supplier/Edit/5 [AcceptVerbs(HttpVerbs.Post)] public ActionResult Edit(Supplier supplierToEdit) { if (!ModelState.IsValid) return View(); try { var originalSupplier = (from c in _entities.SupplierSet where c.SupplierId == supplierToEdit.SupplierId select c).FirstOrDefault(); _entities.ApplyPropertyChanges(originalSupplier.EntityKey.EntitySetName, supplierToEdit); _entities.SaveChanges(); // TODO: Add update logic here return RedirectToAction("Index"); } catch { return View(); } } This is my View ... <h2>Edit</h2> <% using (Html.BeginForm()) {%> <%= Html.ValidationSummary(true) %> <fieldset> <legend>Fields</legend> <div class="editor-label"> <%= Html.LabelFor(model => model.CompanyName) %> </div> <div class="editor-field"> <%= Html.TextBoxFor(model => model.CompanyName) %> <%= Html.ValidationMessageFor(model => model.CompanyName) %> </div> <div class="editor-label"> <%= Html.LabelFor(model => model.ContactName) %> </div> <div class="editor-field"> <%= Html.TextBoxFor(model => model.ContactName) %> <%= Html.ValidationMessageFor(model => model.ContactName) %> </div> <div class="editor-label"> <%= Html.LabelFor(model => model.ContactTitle) %> </div> <div class="editor-field"> <%= Html.TextBoxFor(model => model.ContactTitle) %> <%= Html.ValidationMessageFor(model => model.ContactTitle) %> </div> <div class="editor-label"> <%= Html.LabelFor(model => model.Address) %> </div> <div class="editor-field"> <%= Html.TextBoxFor(model => model.Address) %> <%= Html.ValidationMessageFor(model => model.Address) %> </div> <div class="editor-label"> <%= Html.LabelFor(model => model.City) %> </div> <div class="editor-field"> <%= Html.TextBoxFor(model => model.City) %> <%= Html.ValidationMessageFor(model => model.City) %> </div> <div class="editor-label"> <%= Html.LabelFor(model => model.PostalCode) %> </div> <div class="editor-field"> <%= Html.TextBoxFor(model => model.PostalCode) %> <%= Html.ValidationMessageFor(model => model.PostalCode) %> </div> <div class="editor-label"> <%= Html.LabelFor(model => model.Country) %> </div> <div class="editor-field"> <%= Html.TextBoxFor(model => model.Country) %> <%= Html.ValidationMessageFor(model => model.Country) %> </div> <div class="editor-label"> <%= Html.LabelFor(model => model.Telephone) %> </div> <div class="editor-field"> <%= Html.TextBoxFor(model => model.Telephone) %> <%= Html.ValidationMessageFor(model => model.Telephone) %> </div> <div class="editor-label"> <%= Html.LabelFor(model => model.Fax) %> </div> <div class="editor-field"> <%= Html.TextBoxFor(model => model.Fax) %> <%= Html.ValidationMessageFor(model => model.Fax) %> </div> <div class="editor-label"> <%= Html.LabelFor(model => model.HomePage) %> </div> <div class="editor-field"> <%= Html.TextBoxFor(model => model.HomePage) %> <%= Html.ValidationMessageFor(model => model.HomePage) %> </div> <p> <input type="submit" value="Save" /> </p> </fieldset> <% } %> <div> <%= Html.ActionLink("Back to List", "Index") %> </div>
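
One likely cause is that the Supplier bound from the form post is a detached object with no EntityKey, so ApplyPropertyChanges has nothing to match against the attached originalSupplier, and the catch block then hides whatever exception is raised. A minimal sketch of one workaround, assuming the Entity Framework v1 context implied by ApplyPropertyChanges and the names used in the question:

    [AcceptVerbs(HttpVerbs.Post)]
    public ActionResult Edit(Supplier supplierToEdit)
    {
        if (!ModelState.IsValid)
            return View();

        var originalSupplier = (from c in _entities.SupplierSet
                                where c.SupplierId == supplierToEdit.SupplierId
                                select c).FirstOrDefault();

        // Give the posted (detached) entity the key of the attached one,
        // then copy its scalar values onto the original and save.
        supplierToEdit.EntityKey = originalSupplier.EntityKey;
        _entities.ApplyPropertyChanges(originalSupplier.EntityKey.EntitySetName, supplierToEdit);
        _entities.SaveChanges();

        return RedirectToAction("Index");
    }

Logging or rethrowing inside the catch block would also make failures like this visible instead of silently returning the view.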

    Read the article

  • Authoritative sources about Database vs. Flatfile decision

    - by FastAl
    <tldr>looking for a reference to a book or other undeniably authoritative source that gives reasons when you should choose a database vs. when you should choose other storage methods. I have provided an un-authoritative list of reasons about 2/3 of the way down this post.</tldr> I have a situation at my company where a database is being used where it would be better to use another solution (in this case, an auto-generated piece of source code that contains a static lookup table, searched by binary sort). Normally, a database would be an OK solution even though the problem does not require a database, e.g, none of the elements of ACID are needed, as it is read-only data, updated about every 3-5 years (also requiring other sourcecode changes), and fits in memory, and can be keyed into via binary search (a tad faster than db, but speed is not an issue). The problem is that this code runs on our enterprise server, but is shared with several PC platforms (some disconnected, some use a central DB, etc.), and parts of it are managed by multiple programming units, parts by the DBAs, parts even by mathematicians in another department, etc. These hit their own platform’s version of their databases (containing their own copy of the static data). What happens is that every implementation, every little change, something different goes wrong. There are many other issues as well. I can’t even use a flatfile, because one mode of running on our enterprise server does not have permission to read files (only databases, and of course, its own literal storage, e.g., in-source table). Of course, other parts of the system use databases in proper, less obscure manners; there is no problem with those parts. So why don’t we just change it? I don’t have administrative ability to force a change. But I’m affected because sometimes I have to help fix the problems, but mostly because it causes outages and tons of extra IT time by other programmers and d*mmit that makes me mad! The reason neither management, nor the designers of the system, can see the problem is that they propose a solution that won’t work: increase communication; implement more safeguards and standards; etc. But every time, in a different part of the already-pared-down but still multi-step processes, a few different diligent, hard-working, top performing IT personnel make a unique subtle error that causes it to fail, sometimes after the last round of testing! And in general these are not single-person failures, but understandable miscommunications. And communication at our company is actually better than most. People just don't think that's the case because they haven't dug into the matter. However, I have it on very good word from somebody with extensive formal study of sociology and psychology that the relatively small amount of less-than-proper database usage in this gigantic cross-platform multi-source, multi-language project is bureaucratically un-maintainable. Impossible. No chance. At least with Human Beings in the loop, and it can’t be automated. In addition, the management and developers who could change this, though intelligent and capable, don’t understand the rigidity of this ‘how humans are’ issue, and are not convincible on the matter. 
The reason putting the static data in sourcecode will solve the problem is, although the solution is less sexy than a database, it would function with no technical drawbacks; and since the sharing of sourcecode already works very well, you basically erase any database-related effort from this section of the project, along with all the drawbacks of it that are causing problems. OK, that’s the background, for the curious. I won’t be able to convince management that this is an unfixable sociological problem, and that the real solution is coding around these limits of human nature, just as you would code around a bug in a 3rd party component that you can’t change. So what I have to do is exploit the unsuitableness of the database solution, and not do it using logic, but rather authority. I am aware of many reasons, and posts on this site giving reasons for one over the other; I’m not looking for lists of reasons like these (although you can add a comment if I've miss a doozy): WHY USE A DATABASE? instead of flatfile/other DB vs. file: if you need... Random Read / Transparent search optimization Advanced / varied / customizable Searching and sorting capabilities Transaction/rollback Locks, semaphores Concurrency control / Shared users Security 1-many/m-m is easier Easy modification Scalability Load Balancing Random updates / inserts / deletes Advanced query Administrative control of design, etc. SQL / learning curve Debugging / Logging Centralized / Live Backup capabilities Cached queries / dvlp & cache execution plans Interleaved update/read Referential integrity, avoid redundant/missing/corrupt/out-of-sync data Reporting (from on olap or oltp db) / turnkey generation tools [Disadvantages:] Important to get right the first time - professional design - but only b/c it's meant to last s/w & h/w cost Usu. over a network, speed issue (best vs. best design vs. local=even then a separate process req's marshalling/netwk layers/inter-p comm) indicies and query processing can stand in the way of simple processing (vs. flatfile) WHY USE FLATFILE: If you only need... Sequential Row processing only Limited usage append only (no reading, no master key/update) Only Update the record you're reading (fixed length recs only) Too big to fit into memory If Local disk / read-ahead network connection Portability / small system Email / cut & Paste / store as document by novice - simple format Low design learning curve but high cost later WHY USE IN-MEMORY/TABLE (tables, arrays, etc.): if you need... Processing a single db/ff record that was imported Known size of data Static data if hardcoding the table Narrow, unchanging use (e.g., one program or proc) -includes a class that will be shared, but encapsulates its data manipulation Extreme speed needed / high transaction frequency Random access - but search is dependent on implementation Following are some other posts about the topic: http://stackoverflow.com/questions/1499239/database-vs-flat-text-file-what-are-some-technical-reasons-for-choosing-one-over http://stackoverflow.com/questions/332825/are-flat-file-databases-any-good http://stackoverflow.com/questions/2356851/database-vs-flat-files http://stackoverflow.com/questions/514455/databases-vs-plain-text/514530 What I’d like to know is if anybody could recommend a hard, authoritative source containing these reasons. I’m looking for a paper book I can buy, or a reputable website with whitepapers about the issue (e.g., Microsoft, IBM), not counting the user-generated content on those sites. 
This will have a greater chance of eliciting the change that I’m looking for: less wasted programmer time, and more reliable programs. Thanks very much for your help. You win a prize for reading such a large post!
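
For what it is worth, the in-source alternative described above amounts to a small piece of generated code: a sorted, read-only table plus a binary search over its keys. A minimal sketch of the shape such a generated class could take (the class name, keys and values here are purely illustrative):

    using System;

    // Hypothetical generated file; regenerated whenever the static data changes.
    public static class StaticRateTable
    {
        // Keys must stay sorted for Array.BinarySearch to be valid.
        private static readonly int[] Keys = { 1001, 1002, 1005, 1010 };
        private static readonly decimal[] Values = { 0.10m, 0.15m, 0.20m, 0.25m };

        public static bool TryGetValue(int key, out decimal value)
        {
            int index = Array.BinarySearch(Keys, key);
            if (index >= 0)
            {
                value = Values[index];
                return true;
            }
            value = 0m;
            return false;
        }
    }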

    Read the article

  • IOException: Unable To Delete Images Due To File Lock

    - by Arslan Pervaiz
    I am Unable To Delete Image File From My Server Path It Gaves Error That The Process Cannot Access The File "FileName" Because it is being Used By Another Process. I Tried Many Methods But Still All In Vain. Please Help me Out in This Issue. Here is My Code Snippet. using System; using System.Data; using System.Web; using System.Data.SqlClient; using System.Web.UI; using System.Web.UI.HtmlControls; using System.Globalization; using System.Web.Security; using System.Text; using System.DirectoryServices; using System.Collections; using System.IO; using System.Drawing; using System.Drawing.Imaging; using System.Drawing.Drawing2D; //============ Main Block ================= byte[] data = (byte[])ds.Tables[0].Rows[0][0]; MemoryStream ms = new MemoryStream(data); Image returnImage = Image.FromStream(ms); returnImage.Save(Server.MapPath(".\\TmpImages\\SavedImage.jpg"), System.Drawing.Imaging.ImageFormat.Jpeg); returnImage.Dispose(); \\ I Tried this Dispose Method To Unlock The File But Nothing Done. ms.Close(); \\ I Tried The Memory Stream Close Method Also But Its Also Not Worked For Me. watermark(); \\ Here is My Water Mark Method That Print Water Mark Image on My Saved Image (Image That is Converted From Byte Array) DeleteImages(); \\ Here is My Delete Method That I Call To Delete The Images //===== ==== My Delete Method To Delete Files================== public void DeleteImages() { try { File.Delete(Server.MapPath(".\\TmpImages\\WaterMark.jpg")); \\This Image Deleted Fine. File.Delete(Server.MapPath(".\\TmpImages\\SavedImage.jpg")); \\ Exception Thrown On Deleting of This Image. } catch (Exception ex) { LogManager.LogException(ex, "Error in Deleting Images."); Master.ShowMessage(ex.Message, true); } } \ ==== Method Declartion That Make Watermark of One Image On Another Image.======= public void watermark() { //create a image object containing the photograph to watermark Image imgPhoto = Image.FromFile(Server.MapPath(".\\TmpImages\\SavedImage.jpg")); int phWidth = imgPhoto.Width; int phHeight = imgPhoto.Height; //create a Bitmap the Size of the original photograph Bitmap bmPhoto = new Bitmap(phWidth, phHeight, PixelFormat.Format24bppRgb); bmPhoto.SetResolution(imgPhoto.HorizontalResolution, imgPhoto.VerticalResolution); //load the Bitmap into a Graphics object Graphics grPhoto = Graphics.FromImage(bmPhoto); //create a image object containing the watermark Image imgWatermark = new Bitmap(Server.MapPath(".\\TmpImages\\PrintasWatermark.jpg")); int wmWidth = imgWatermark.Width; int wmHeight = imgWatermark.Height; //Set the rendering quality for this Graphics object grPhoto.SmoothingMode = SmoothingMode.AntiAlias; //Draws the photo Image object at original size to the graphics object. grPhoto.DrawImage( imgPhoto, // Photo Image object new Rectangle(0, 0, phWidth, phHeight), // Rectangle structure 0, // x-coordinate of the portion of the source image to draw. 0, // y-coordinate of the portion of the source image to draw. phWidth, // Width of the portion of the source image to draw. phHeight, // Height of the portion of the source image to draw. 
GraphicsUnit.Pixel); // Units of measure //------------------------------------------------------- //to maximize the size of the Copyright message we will //test multiple Font sizes to determine the largest posible //font we can use for the width of the Photograph //define an array of point sizes you would like to consider as possiblities //------------------------------------------------------- //Define the text layout by setting the text alignment to centered StringFormat StrFormat = new StringFormat(); StrFormat.Alignment = StringAlignment.Center; //define a Brush which is semi trasparent black (Alpha set to 153) SolidBrush semiTransBrush2 = new SolidBrush(Color.FromArgb(153, 0, 0, 0)); //define a Brush which is semi trasparent white (Alpha set to 153) SolidBrush semiTransBrush = new SolidBrush(Color.FromArgb(153, 255, 255, 255)); //------------------------------------------------------------ //Step #2 - Insert Watermark image //------------------------------------------------------------ //Create a Bitmap based on the previously modified photograph Bitmap Bitmap bmWatermark = new Bitmap(bmPhoto); bmWatermark.SetResolution(imgPhoto.HorizontalResolution, imgPhoto.VerticalResolution); //Load this Bitmap into a new Graphic Object Graphics grWatermark = Graphics.FromImage(bmWatermark); //To achieve a transulcent watermark we will apply (2) color //manipulations by defineing a ImageAttributes object and //seting (2) of its properties. ImageAttributes imageAttributes = new ImageAttributes(); //The first step in manipulating the watermark image is to replace //the background color with one that is trasparent (Alpha=0, R=0, G=0, B=0) //to do this we will use a Colormap and use this to define a RemapTable ColorMap colorMap = new ColorMap(); //My watermark was defined with a background of 100% Green this will //be the color we search for and replace with transparency colorMap.OldColor = Color.FromArgb(255, 0, 255, 0); colorMap.NewColor = Color.FromArgb(0, 0, 0, 0); ColorMap[] remapTable = { colorMap }; imageAttributes.SetRemapTable(remapTable, ColorAdjustType.Bitmap); //The second color manipulation is used to change the opacity of the //watermark. This is done by applying a 5x5 matrix that contains the //coordinates for the RGBA space. By setting the 3rd row and 3rd column //to 0.3f we achive a level of opacity float[][] colorMatrixElements = { new float[] {1.0f, 0.0f, 0.0f, 0.0f, 0.0f}, new float[] {0.0f, 1.0f, 0.0f, 0.0f, 0.0f}, new float[] {0.0f, 0.0f, 1.0f, 0.0f, 0.0f}, new float[] {0.0f, 0.0f, 0.0f, 0.3f, 0.0f}, new float[] {0.0f, 0.0f, 0.0f, 0.0f, 1.0f}}; ColorMatrix wmColorMatrix = new ColorMatrix(colorMatrixElements); imageAttributes.SetColorMatrix(wmColorMatrix, ColorMatrixFlag.Default, ColorAdjustType.Bitmap); //For this example we will place the watermark in the upper right //hand corner of the photograph. offset down 10 pixels and to the //left 10 pixles int xPosOfWm = ((phWidth - wmWidth) - 10); int yPosOfWm = 10; grWatermark.DrawImage(imgWatermark, new Rectangle(xPosOfWm, yPosOfWm, wmWidth, wmHeight), //Set the detination Position 0, // x-coordinate of the portion of the source image to draw. 0, // y-coordinate of the portion of the source image to draw. wmWidth, // Watermark Width wmHeight, // Watermark Height GraphicsUnit.Pixel, // Unit of measurment imageAttributes); //ImageAttributes Object //Replace the original photgraphs bitmap with the new Bitmap imgPhoto = bmWatermark; grPhoto.Dispose(); grWatermark.Dispose(); //save new image to file system. 
imgPhoto.Save(Server.MapPath(".\\TmpImages\\WaterMark.jpg"), ImageFormat.Jpeg); imgPhoto.Dispose(); imgWatermark.Dispose(); }
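
A likely culprit is that Image.FromFile keeps SavedImage.jpg locked for as long as the returned Image lives, and in watermark() that original image is never disposed: the imgPhoto variable is reassigned to bmWatermark before Dispose is called, so the file handle survives until the garbage collector releases it. A minimal sketch of one way to avoid holding the lock at all, assuming the same paths as above, is to copy the pixels into an independent Bitmap and let the stream close immediately:

    // Read the file into an independent Bitmap so no handle stays open on SavedImage.jpg.
    Image imgPhoto;
    using (var fs = new FileStream(Server.MapPath(".\\TmpImages\\SavedImage.jpg"),
                                   FileMode.Open, FileAccess.Read))
    using (var original = Image.FromStream(fs))
    {
        imgPhoto = new Bitmap(original);   // full copy; safe to use after fs is closed
    }
    // ... draw the watermark onto imgPhoto / bmWatermark as before ...
    imgPhoto.Dispose();                    // dispose every Image/Bitmap before DeleteImages()

With the original image copied this way, every remaining Bitmap can be disposed (ideally via using blocks) before DeleteImages() runs, leaving nothing with a handle on the file.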

    Read the article

  • How to display table in ASP.NET website?

    - by salvationishere
    I am developing a C# VS 2008 / SQL Server 2008 website, but now I cannot display the Gridview containing my table. I included the Default.aspx and aspx.cs files below. What do I need to do to fix this? I am not getting any errors now; just that this grid does not show up. Thanks! ASPX file: <%@ Page Language="C#" MasterPageFile="~/Site.master" AutoEventWireup="true" CodeFile="Default.aspx.cs" Inherits="_Default" Title="Untitled Page" %> <asp:Content ID="Content1" ContentPlaceHolderID="MainContent" Runat="Server"> <asp:Panel runat="server" ID="AuthenticatedMessagePanel"> <asp:Label runat="server" ID="WelcomeBackMessage"></asp:Label> <table> <tr > <td> <asp:Label ID="tableLabel" runat="server" Font-Bold="True" Text="Select target table:"></asp:Label> </td> <td> <asp:Label ID="inputLabel" runat="server" Font-Bold="True" Text="Select input file:"></asp:Label> </td></tr> <tr><td valign="top"> <asp:Label ID="feedbackLabel" runat="server"></asp:Label> <asp:SqlDataSource ID="SelectTables" runat="server" ConnectionString="<%$ ConnectionStrings:AdventureWorks3_SelectTables %>" SelectCommand="SELECT TABLE_NAME FROM information_schema.Tables WHERE TABLE_TYPE = 'BASE TABLE' AND TABLE_SCHEMA = @SchemaName"> <SelectParameters> <asp:Parameter Name="SchemaName" Type="String" DefaultValue="" /> </SelectParameters> <InsertParameters> <asp:Parameter Name="TABLE_NAME" Direction="Output" Type="String" DefaultValue="" /> </InsertParameters> </asp:SqlDataSource> <asp:GridView ID="GridView1" DatasourceID="SelectTables" runat="server" style="WIDTH: 400px;" CellPadding="4" ForeColor="#333333" GridLines="None" onselectedindexchanged="GridView1_SelectedIndexChanged" AutoGenerateSelectButton="True" DataKeyNames="TABLE_NAME"> <RowStyle BackColor="#F7F6F3" ForeColor="#333333" /> <Columns> <asp:BoundField HeaderText="TABLE_NAME" DataField="TABLE_NAME" /> </Columns> <FooterStyle BackColor="#5D7B9D" Font-Bold="True" ForeColor="White" /> <PagerStyle BackColor="#284775" ForeColor="White" HorizontalAlign="Center" /> <SelectedRowStyle BackColor="#E2DED6" Font-Bold="True" ForeColor="#333333" /> <HeaderStyle BackColor="#5D7B9D" Font-Bold="True" ForeColor="White" /> <EditRowStyle BackColor="#999999" /> <AlternatingRowStyle BackColor="White" ForeColor="#284775" /> </asp:GridView> </td> <td valign="top"> <input id="uploadFile" type="file" size="26" runat="server" name="uploadFile" title="UploadFile" class="greybar" enableviewstate="True" /> </td></tr> </table> </asp:Panel> <asp:Panel runat="Server" ID="AnonymousMessagePanel"> <asp:HyperLink runat="server" ID="lnkLogin" Text="Log In" NavigateUrl="~/Login.aspx"> </asp:HyperLink> </asp:Panel> </asp:Content> ASPX.CS file: using System; using System.Collections; using System.Configuration; using System.Data; using System.Linq; using System.Web; using System.Web.Security; using System.Web.UI; using System.Web.UI.HtmlControls; using System.Web.UI.WebControls; using System.Web.UI.WebControls.WebParts; using System.Xml.Linq; using System.Collections.Generic; using System.IO; using System.Drawing; using System.ComponentModel; using System.Data.SqlClient; using ADONET_namespace; using System.Security.Principal; //using System.Windows; public partial class _Default : System.Web.UI.Page //namespace AddFileToSQL { //protected System.Web.UI.HtmlControls.HtmlInputFile uploadFile; protected System.Web.UI.HtmlControls.HtmlInputButton btnOWrite; protected System.Web.UI.HtmlControls.HtmlInputButton btnAppend; protected System.Web.UI.WebControls.Label Label1; protected static string inputfile = ""; 
public static string targettable; public static string selection; // Number of controls added to view state protected int default_NumberOfControls { get { if (ViewState["default_NumberOfControls"] != null) { return (int)ViewState["default_NumberOfControls"]; } else { return 0; } } set { ViewState["default_NumberOfControls"] = value; } } protected void uploadFile_onclick(object sender, EventArgs e) { } protected void Load_GridData() { //GridView1.DataSource = ADONET_methods.DisplaySchemaTables(); //GridView1.DataBind(); } protected void btnOWrite_Click(object sender, EventArgs e) { if (uploadFile.PostedFile.ContentLength > 0) { feedbackLabel.Text = "You do not have sufficient access to overwrite table records."; } else { feedbackLabel.Text = "This file does not contain any data."; } } protected void btnAppend_Click(object sender, EventArgs e) { string fullpath = Page.Request.PhysicalApplicationPath; string path = uploadFile.PostedFile.FileName; if (File.Exists(path)) { // Create a file to write to. try { StreamReader sr = new StreamReader(path); string s = ""; while (sr.Peek() > 0) s = sr.ReadLine(); sr.Close(); } catch (IOException exc) { Console.WriteLine(exc.Message + "Cannot open file."); return; } } if (uploadFile.PostedFile.ContentLength > 0) { inputfile = System.IO.File.ReadAllText(path); Session["Message"] = inputfile; Response.Redirect("DataMatch.aspx"); } else { feedbackLabel.Text = "This file does not contain any data."; } } protected void Page_Load(object sender, EventArgs e) { if (Request.IsAuthenticated) { WelcomeBackMessage.Text = "Welcome back, " + User.Identity.Name + "!"; // Reference the CustomPrincipal / CustomIdentity CustomIdentity ident = User.Identity as CustomIdentity; if (ident != null) WelcomeBackMessage.Text += string.Format(" You are the {0} of {1}.", ident.Title, ident.CompanyName); AuthenticatedMessagePanel.Visible = true; AnonymousMessagePanel.Visible = false; if (!Page.IsPostBack) { Load_GridData(); } } else { AuthenticatedMessagePanel.Visible = false; AnonymousMessagePanel.Visible = true; } } protected void GridView1_SelectedIndexChanged(object sender, EventArgs e) { GridViewRow row = GridView1.SelectedRow; targettable = row.Cells[2].Text; } }
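
One thing that stands out is that the SqlDataSource's @SchemaName select parameter has an empty DefaultValue, so the information_schema query can return zero rows and the GridView has nothing to render. A minimal sketch of supplying the schema and rebinding from the code-behind, assuming the tables of interest live in the dbo schema:

    protected void Load_GridData()
    {
        // Give the SqlDataSource the schema it is filtering on, then rebind the grid.
        SelectTables.SelectParameters["SchemaName"].DefaultValue = "dbo";
        GridView1.DataBind();
    }

Alternatively the same value could be set declaratively with DefaultValue="dbo" on the asp:Parameter itself.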

    Read the article

  • Jquery Session & Table Filtering

    - by Bry4n
    This is my Jquery <script type="text/javascript"> $(function() { var from = $.session("from"); var to = $.session("to"); var $th = $('#theTable').find('th'); // had to add the classes here to not grab the "td" inside those tables var $td = $('#theTable').find('td.bluedata,td.yellowdata'); $th.hide(); $td.hide(); if (to == "Select" || from == "Select") { // shortcut - nothing set, show everything $th.add($td).show(); return; } var filterArray = new Array(); filterArray[0] = to; filterArray[1] = from; $.each(filterArray, function(i){ if (filterArray[i].toString() == "Select") { filterArray[i] = ""; } }); $($th).each(function(){ if ($( this,":eq(0):contains('" + filterArray[0].toString() + "')") != null && $(this,":eq(1):contains('" + filterArray[1].toString() + "')") != null) { $(this).show(); } }); $($td).each(function(){ if ($( this,":eq(0):contains('" + filterArray[0].toString() + "')") != null && $(this,":eq(1):contains('" + filterArray[1].toString() + "')") != null) { $(this).show(); } }); }); </script> This is my table <table border="1" id="theTable"> <tr class="headers"> <th class="bluedata"height="20px" valign="top">63rd St. &amp; Malvern Av. Loop<BR/></th> <th class="yellowdata"height="20px" valign="top">52nd St. &amp; Lansdowne Av.<BR/></th> <th class="bluedata"height="20px" valign="top">Lancaster &amp; Girard Avs<BR/></th> <th class="yellowdata"height="20px" valign="top">40th St. &amp; Lancaster Av.<BR/></th> <th class="bluedata"height="20px" valign="top">36th &amp; Market Sts<BR/></th> <th class="bluedata"height="20px" valign="top">6th &amp; Market Sts<BR/></th> <th class="yellowdata"height="20px" valign="top">Juniper Station<BR/></th> </tr> <tr> <td class="bluedata"height="20px" title="63rd St. &amp; Malvern Av. Loop"> <table width="100%"><tr><td>12:17am</td></tr><tr><td>12:17am</td></tr><tr><td>12:47am</td></tr></table> </td> <td class="yellowdata"height="20px" title="52nd St. &amp; Lansdowne Av."> <table width="100%"><tr><td>12:17am</td></tr><tr><td>12:17am</td></tr><tr><td>12:47am</td></tr></table> </td> <td class="bluedata"height="20px" title="Lancaster &amp; Girard Avs"> <table width="100%"><tr><td>12:17am</td></tr><tr><td>12:17am</td></tr><tr><td>12:47am</td></tr></table> </td> <td class="yellowdata"height="20px" title="40th St. &amp; Lancaster Av."> <table width="100%"><tr><td>12:17am</td></tr><tr><td>12:17am</td></tr><tr><td>12:47am</td></tr></table> </td> <td class="bluedata"height="20px" title="36th &amp; Market Sts"> <table width="100%"><tr><td>12:17am</td></tr><tr><td>12:17am</td></tr><tr><td>12:47am</td></tr></table> </td> <td class="bluedata"height="20px" title="6th &amp; Market Sts"> <table width="100%"><tr><td>12:17am</td></tr><tr><td>12:17am</td></tr><tr><td>12:47am</td></tr></table> </td> <td class="bluedata"height="20px" title="Juniper Station"> <table width="100%"><tr><td>12:17am</td></tr><tr><td>12:17am</td></tr><tr><td>12:47am</td></tr></table> </td> </tr> </table> I have asked questions on here before and I have had success in converting textbox values to dropdown changes. However this is a bit different. I am using the sessions plugin (which works fine). On one page I have a set of normal drop downs, on submit you get taken to a separate page which runs the function above, however the rows/columns all show and they don't seem to filter at all.

    Read the article

  • Deadlock in SQL Server 2005! Two real-time bulk upserts are fighting. WHY?

    - by skimania
    Here's the scenario: I've got a table called MarketDataCurrent (MDC) that has live updating stock prices. I've got one process called 'LiveFeed' which reads prices streaming from the wire, queues up inserts, and uses a 'bulk upload to temp table, then insert/update to MDC table' approach (BulkUpsert). I've got another process which then reads this data, computes other data, and saves the results back into the same table, using a similar BulkUpsert stored proc. Thirdly, a multitude of users run a C# GUI that polls the MDC table and reads updates from it.

    During the day, when the data is changing rapidly, things run pretty smoothly, but after market hours we've recently started seeing an increasing number of deadlock exceptions coming out of the database - nowadays we see 10-20 a day. The important thing to note here is that these happen when the values are NOT changing.

    Here's all the relevant info. Table def:

    CREATE TABLE [dbo].[MarketDataCurrent](
        [MDID] [int] NOT NULL,
        [LastUpdate] [datetime] NOT NULL,
        [Value] [float] NOT NULL,
        [Source] [varchar](20) NULL,
        CONSTRAINT [PK_MarketDataCurrent] PRIMARY KEY CLUSTERED
        (
            [MDID] ASC
        ) WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
    ) ON [PRIMARY]

    I've got a SQL Profiler trace running, catching the deadlocks, and here's what all the graphs look like. Process 258 is calling the following 'BulkUpsert' stored proc repeatedly, while 73 is calling the next one:

    ALTER proc [dbo].[MarketDataCurrent_BulkUpload]
        @updateTime datetime,
        @source varchar(10)
    as
    begin transaction
    update c with (rowlock)
       set LastUpdate = getdate(), Value = t.Value, Source = @source
      from MarketDataCurrent c
           INNER JOIN #MDTUP t ON c.MDID = t.mdid
     where c.lastUpdate < @updateTime
       and c.mdid not in (select mdid from MarketData where LiveFeedTicker is not null and PriceSource like 'LiveFeed.%')
       and c.value <> t.value

    insert into MarketDataCurrent with (rowlock)
    select MDID, getdate(), Value, @source
      from #MDTUP
     where mdid not in (select mdid from MarketDataCurrent with (nolock))
       and mdid not in (select mdid from MarketData where LiveFeedTicker is not null and PriceSource like 'LiveFeed.%')
    commit

    And the other one:

    ALTER PROCEDURE [dbo].[MarketDataCurrent_LiveFeedUpload]
    AS
    begin transaction
    -- Update existing mdid
    UPDATE c WITH (ROWLOCK)
       SET LastUpdate = t.LastUpdate, Value = t.Value, Source = t.Source
      FROM MarketDataCurrent c
           INNER JOIN #TEMPTABLE2 t ON c.MDID = t.mdid;

    -- Insert new MDID
    INSERT INTO MarketDataCurrent with (ROWLOCK)
    SELECT * FROM #TEMPTABLE2
     WHERE MDID NOT IN (SELECT MDID FROM MarketDataCurrent with (NOLOCK))

    -- Clean up the temp table
    DELETE #TEMPTABLE2
    commit

    To clarify, those temp tables are being created by the C# code on the same connection and are populated using the C# SqlBulkCopy class. To me it looks like it's deadlocking on the PK of the table, so I tried removing that PK and switching to a unique constraint instead, but that increased the number of deadlocks 10-fold. I'm totally lost as to what to do about this situation and am open to just about any suggestion. HELP!!
In response to the request for the XDL, here it is: <deadlock-list> <deadlock victim="processc19978"> <process-list> <process id="processaf0b68" taskpriority="0" logused="0" waitresource="KEY: 6:72057594090487808 (d900ed5a6cc6)" waittime="718" ownerId="1102128174" transactionname="user_transaction" lasttranstarted="2010-06-11T16:30:44.750" XDES="0xffffffff817f9a40" lockMode="U" schedulerid="3" kpid="8228" status="suspended" spid="73" sbid="0" ecid="0" priority="0" transcount="2" lastbatchstarted="2010-06-11T16:30:44.750" lastbatchcompleted="2010-06-11T16:30:44.750" clientapp=".Net SqlClient Data Provider" hostname="RISKAPPS_VM" hostpid="3836" loginname="RiskOpt" isolationlevel="read committed (2)" xactid="1102128174" currentdb="6" lockTimeout="4294967295" clientoption1="671088672" clientoption2="128056"> <executionStack> <frame procname="MKP_RISKDB.dbo.MarketDataCurrent_BulkUpload" line="28" stmtstart="1062" stmtend="1720" sqlhandle="0x03000600a28e5e4ef4fd8e00849d00000100000000000000"> UPDATE c WITH (ROWLOCK) SET LastUpdate = getdate(), Value = t.Value, Source = @source FROM MarketDataCurrent c INNER JOIN #MDTUP t ON c.MDID = t.mdid WHERE c.lastUpdate &lt; @updateTime and c.mdid not in (select mdid from MarketData where BloombergTicker is not null and PriceSource like &apos;Blbg.%&apos;) and c.value &lt;&gt; t.value </frame> <frame procname="adhoc" line="1" stmtstart="88" sqlhandle="0x01000600c1653d0598706ca7000000000000000000000000"> exec MarketDataCurrent_BulkUpload @clearBefore, @source </frame> <frame procname="unknown" line="1" sqlhandle="0x000000000000000000000000000000000000000000000000"> unknown </frame> </executionStack> <inputbuf> (@clearBefore datetime,@source nvarchar(10))exec MarketDataCurrent_BulkUpload @clearBefore, @source </inputbuf> </process> <process id="processc19978" taskpriority="0" logused="0" waitresource="KEY: 6:72057594090487808 (74008e31572b)" waittime="718" ownerId="1102128228" transactionname="user_transaction" lasttranstarted="2010-06-11T16:30:44.780" XDES="0x380be9d8" lockMode="U" schedulerid="5" kpid="8464" status="suspended" spid="248" sbid="0" ecid="0" priority="0" transcount="2" lastbatchstarted="2010-06-11T16:30:44.780" lastbatchcompleted="2010-06-11T16:30:44.780" clientapp=".Net SqlClient Data Provider" hostname="RISKBBG_VM" hostpid="4480" loginname="RiskOpt" isolationlevel="read committed (2)" xactid="1102128228" currentdb="6" lockTimeout="4294967295" clientoption1="671088672" clientoption2="128056"> <executionStack> <frame procname="MKP_RISKDB.dbo.MarketDataCurrentBlbgRtUpload" line="14" stmtstart="840" stmtend="1220" sqlhandle="0x03000600005f9d24c8878f00849d00000100000000000000"> UPDATE c WITH (ROWLOCK) SET LastUpdate = t.LastUpdate, Value = t.Value, Source = t.Source FROM MarketDataCurrent c INNER JOIN #TEMPTABLE2 t ON c.MDID = t.mdid; -- Insert new MDID </frame> <frame procname="adhoc" line="1" sqlhandle="0x010006004a58132228bf8d73000000000000000000000000"> MarketDataCurrentBlbgRtUpload </frame> </executionStack> <inputbuf> MarketDataCurrentBlbgRtUpload </inputbuf> </process> </process-list> <resource-list> <keylock hobtid="72057594090487808" dbid="6" objectname="MKP_RISKDB.dbo.MarketDataCurrent" indexname="PK_MarketDataCurrent" id="lock5ba77b00" mode="U" associatedObjectId="72057594090487808"> <owner-list> <owner id="processc19978" mode="U"/> </owner-list> <waiter-list> <waiter id="processaf0b68" mode="U" requestType="wait"/> </waiter-list> </keylock> <keylock hobtid="72057594090487808" dbid="6" objectname="MKP_RISKDB.dbo.MarketDataCurrent" 
indexname="PK_MarketDataCurrent" id="lock65dca340" mode="U" associatedObjectId="72057594090487808"> <owner-list> <owner id="processaf0b68" mode="U"/> </owner-list> <waiter-list> <waiter id="processc19978" mode="U" requestType="wait"/> </waiter-list> </keylock> </resource-list> </deadlock> </deadlock-list>

    Read the article

  • SQL error - Cannot convert nvarchar to decimal

    - by jakesankey
    I have a C# application that simply parses all of the txt documents within a given network directory and imports the data to a SQL server db. Everything was cruising along just fine until about the 1800th file when it happend to have a few blanks in columns that are called out as DBType.Decimal (and the value is usually zero in the files, not blank). So I got this error, "cannot convert nvarchar to decimal". I am wondering how I could tell the app to simply skip the lines that have this issue?? Perhaps I could even just change the column type to varchar even tho values are numbers (what problems could this create?) Thanks for any help! using System; using System.Data; using System.Data.SQLite; using System.IO; using System.Text.RegularExpressions; using System.Threading; using System.Collections.Generic; using System.Linq; using System.Data.SqlClient; namespace JohnDeereCMMDataParser { internal class Program { public static List<string> GetImportedFileList() { List<string> ImportedFiles = new List<string>(); using (SqlConnection connect = new SqlConnection(@"Server=FRXSQLDEV;Database=RX_CMMData;Integrated Security=YES")) { connect.Open(); using (SqlCommand fmd = connect.CreateCommand()) { fmd.CommandText = @"SELECT FileName FROM CMMData;"; fmd.CommandType = CommandType.Text; SqlDataReader r = fmd.ExecuteReader(); while (r.Read()) { ImportedFiles.Add(Convert.ToString(r["FileName"])); } } } return ImportedFiles; } private static void Main(string[] args) { Console.Title = "John Deere CMM Data Parser"; Console.WriteLine("Preparing CMM Data Parser... done"); Console.WriteLine("Scanning for new CMM data..."); Console.ForegroundColor = ConsoleColor.Gray; using (SqlConnection con = new SqlConnection(@"Server=FRXSQLDEV;Database=RX_CMMData;Integrated Security=YES")) { con.Open(); using (SqlCommand insertCommand = con.CreateCommand()) { Console.WriteLine("Connecting to SQL server..."); SqlCommand cmdd = con.CreateCommand(); string[] files = Directory.GetFiles(@"C:\Documents and Settings\js91162\Desktop\CMM WENZEL\", "*_*_*.txt", SearchOption.AllDirectories); List<string> ImportedFiles = GetImportedFileList(); insertCommand.Parameters.Add(new SqlParameter("@FeatType", DbType.String)); insertCommand.Parameters.Add(new SqlParameter("@FeatName", DbType.String)); insertCommand.Parameters.Add(new SqlParameter("@Axis", DbType.String)); insertCommand.Parameters.Add(new SqlParameter("@Actual", DbType.Decimal)); insertCommand.Parameters.Add(new SqlParameter("@Nominal", DbType.Decimal)); insertCommand.Parameters.Add(new SqlParameter("@Dev", DbType.Decimal)); insertCommand.Parameters.Add(new SqlParameter("@TolMin", DbType.Decimal)); insertCommand.Parameters.Add(new SqlParameter("@TolPlus", DbType.Decimal)); insertCommand.Parameters.Add(new SqlParameter("@OutOfTol", DbType.Decimal)); foreach (string file in files.Except(ImportedFiles)) { var FileNameExt1 = Path.GetFileName(file); cmdd.Parameters.Clear(); cmdd.Parameters.Add(new SqlParameter("@FileExt", FileNameExt1)); cmdd.CommandText = @" IF (EXISTS (SELECT * FROM INFORMATION_SCHEMA.TABLES WHERE TABLE_SCHEMA = 'RX_CMMData' AND TABLE_NAME = 'CMMData')) BEGIN SELECT COUNT(*) FROM CMMData WHERE FileName = @FileExt; END"; int count = Convert.ToInt32(cmdd.ExecuteScalar()); con.Close(); con.Open(); if (count == 0) { Console.WriteLine("Preparing to parse CMM data for SQL import..."); if (file.Count(c => c == '_') > 5) continue; insertCommand.CommandText = @" INSERT INTO CMMData (FeatType, FeatName, Axis, Actual, Nominal, Dev, TolMin, TolPlus, OutOfTol, PartNumber, 
CMMNumber, Date, FileName) VALUES (@FeatType, @FeatName, @Axis, @Actual, @Nominal, @Dev, @TolMin, @TolPlus, @OutOfTol, @PartNumber, @CMMNumber, @Date, @FileName);"; string FileNameExt = Path.GetFullPath(file); string RNumber = Path.GetFileNameWithoutExtension(file); int index2 = RNumber.IndexOf("~"); Match RNumberE = Regex.Match(RNumber, @"^(R|L)\d{6}(COMP|CRIT|TEST|SU[1-9])(?=_)", RegexOptions.IgnoreCase); Match RNumberD = Regex.Match(RNumber, @"(?<=_)\d{3}[A-Z]\d{4}|\d{3}[A-Z]\d\w\w\d(?=_)", RegexOptions.IgnoreCase); Match RNumberDate = Regex.Match(RNumber, @"(?<=_)\d{8}(?=_)", RegexOptions.IgnoreCase); string RNumE = Convert.ToString(RNumberE); string RNumD = Convert.ToString(RNumberD); if (RNumberD.Value == @"") continue; if (RNumberE.Value == @"") continue; if (RNumberDate.Value == @"") continue; if (index2 != -1) continue; DateTime dateTime = DateTime.ParseExact(RNumberDate.Value, "yyyyMMdd", Thread.CurrentThread.CurrentCulture); string cmmDate = dateTime.ToString("dd-MMM-yyyy"); string[] lines = File.ReadAllLines(file); bool parse = false; foreach (string tmpLine in lines) { string line = tmpLine.Trim(); if (!parse && line.StartsWith("Feat. Type,")) { parse = true; continue; } if (!parse || string.IsNullOrEmpty(line)) { continue; } Console.WriteLine(tmpLine); foreach (SqlParameter parameter in insertCommand.Parameters) { parameter.Value = null; } string[] values = line.Split(new[] { ',' }); for (int i = 0; i < values.Length - 1; i++) { if (i = "" || i = null) continue; SqlParameter param = insertCommand.Parameters[i]; if (param.DbType == DbType.Decimal) { decimal value; param.Value = decimal.TryParse(values[i], out value) ? value : 0; } else { param.Value = values[i]; } } insertCommand.Parameters.Add(new SqlParameter("@PartNumber", RNumE)); insertCommand.Parameters.Add(new SqlParameter("@CMMNumber", RNumD)); insertCommand.Parameters.Add(new SqlParameter("@Date", cmmDate)); insertCommand.Parameters.Add(new SqlParameter("@FileName", FileNameExt)); insertCommand.ExecuteNonQuery(); insertCommand.Parameters.RemoveAt("@PartNumber"); insertCommand.Parameters.RemoveAt("@CMMNumber"); insertCommand.Parameters.RemoveAt("@Date"); insertCommand.Parameters.RemoveAt("@FileName"); } } } Console.WriteLine("CMM data successfully imported to SQL database..."); } con.Close(); } } } }
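
    As a side note, the guard "if (i = "" || i = null)" compares the loop counter to a string with the assignment operator and will not compile as posted; presumably it was meant to inspect values[i]. Below is a small sketch of a "skip the bad line" guard for the inner per-line loop, assuming the same parameter order as above and that a blank or non-numeric value in a decimal column should cause the whole line to be skipped rather than stored as zero.

    // Inside the foreach over lines, after splitting the line:
    string[] values = line.Split(',');

    bool skipLine = false;
    for (int i = 0; i < values.Length - 1 && i < insertCommand.Parameters.Count; i++)
    {
        SqlParameter param = insertCommand.Parameters[i];
        if (param.DbType == DbType.Decimal)
        {
            decimal parsed;
            if (!decimal.TryParse(values[i], out parsed))
            {
                skipLine = true;   // blank or non-numeric value in a decimal column
                break;
            }
            param.Value = parsed;
        }
        else
        {
            param.Value = values[i];
        }
    }

    if (skipLine)
    {
        continue;                  // move on to the next line of the file
    }

    On the other idea: changing the columns to varchar would make the conversion error go away, but it pushes validation, sorting, and aggregation problems downstream into every query that treats those values as numbers, so skipping (or defaulting) bad rows is usually the safer fix.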

    Read the article

  • Customer Attribute, not sorting select options

    - by Bosworth99
    Made a module that creates some customer EAV attributes. One of these attributes is a Select, and I'm dropping a bunch of options into their respective tables. Everything lines up and is accessible on both the front end and the back. Last thing before calling this part of things finished is the sort order of the options. They come out all scrambled, instead of the obvious default or alphabetical (seemingly at random... very wierd). I'm on Mage v1.11 (Pro/Enterprise). config.xml <config> <modules> <WACI_CustomerAttr> <version>0.1.0</version> </WACI_CustomerAttr> </modules> <global> <resources> <customerattr_setup> <setup> <module>WACI_CustomerAttr</module> <class>Mage_Customer_Model_Entity_Setup</class> </setup> <connection> <use>core_setup</use> </connection> </customerattr_setup> </resources> <models> <WACI_CustomerAttr> <class>WACI_CustomerAttr_Model</class> </WACI_CustomerAttr> </models> <fieldsets> <customer_account> <agency><create>1</create><update>1</update></agency> <title><create>1</create><update>1</update></title> <phone><create>1</create><update>1</update></phone> <mailing_address><create>1</create><update>1</update></mailing_address> <city><create>1</create><update>1</update></city> <state><create>1</create><update>1</update></state> <zip><create>1</create><update>1</update></zip> <fed_id><create>1</create><update>1</update></fed_id> <ubi><create>1</create><update>1</update></ubi> </customer_account> </fieldsets> </global> </config> mysql4-install-0.1.0.php <?php Mage::log('Installing WACI_CustomerAttr'); echo 'Running Upgrade: '.get_class($this)."\n <br /> \n"; //die ( 'its running' ); $installer = $this; /* @var $installer Mage_Customer_Model_Entity_Setup */ $installer->startSetup(); // bunch of attributes // State $installer->addAttribute('customer','state', array( 'type' => 'varchar', 'group' => 'Default', 'label' => 'State', 'input' => 'select', 'default' => 'Washington', 'source' => 'WACI_CustomerAttr/customer_attribute_data_select', 'global' => Mage_Catalog_Model_Resource_Eav_Attribute::SCOPE_STORE, 'required' => true, 'visible' => true, 'user_defined' => 1, 'position' => 67 ) ); $attrS = Mage::getSingleton('eav/config')->getAttribute('customer', 'state'); $attrS->addData(array('sort_order'=>67)); $attrS->setData('used_in_forms', array('adminhtml_customer','customer_account_edit','customer_account_create'))->save(); $state_list = array('Alabama','Alaska','Arizona','Arkansas','California','Colorado','Connecticut','Delaware','Florida','Georgia', 'Hawaii','Idaho','Illinois','Indiana','Iowa','Kansas','Kentucky','Louisiana','Maine','Maryland','Massachusetts','Michigan', 'Minnesota','Mississippi','Missouri','Montana','Nebraska','Nevada','New Hampshire','New Jersey','New Mexico','New York', 'North Carolina','North Dakota','Ohio','Oklahoma','Oregon','Pennsylvania','Rhode Island','South Carolina','South Dakota', 'Tennessee','Texas','Utah','Vermont','Virginia','Washington','West Virginia','Wisconsin','Wyoming'); $aOption = array(); $aOption['attribute_id'] = $installer->getAttributeId('customer', 'state'); for($iCount=0;$iCount<sizeof($state_list);$iCount++){ $aOption['value']['option'.$iCount][0] = $state_list[$iCount]; } $installer->addAttributeOption($aOption); // a few more $installer->endSetup(); app/code/local/WACI/CustomerAttr/Model/Customer/Attribute/Data/Select.php <?php class WACI_CustomerAttr_Model_Customer_Attribute_Data_Select extends Mage_Eav_Model_Entity_Attribute_Source_Abstract{ function getAllOptions(){ if (is_null($this->_options)) { $this->_options = 
Mage::getResourceModel('eav/entity_attribute_option_collection') ->setAttributeFilter($this->getAttribute()->getId()) ->setStoreFilter($this->getAttribute()->getStoreId()) ->setPositionOrder('asc') ->load() ->toOptionArray(); } $options = $this->_options; return $options; } } theme/variation/template/persistent/customer/form/register.phtml <li> <?php $attribute = Mage::getModel('eav/config')->getAttribute('customer','state'); ?> <label for="state" class="<?php if($attribute->getIsRequired() == true){?>required<?php } ?>"><?php if($attribute->getIsRequired() == true){?><em>*</em><?php } ?><?php echo $this->__('State') ?></label> <div class="input-box"> <select name="state" id="state" class="<?php if($attribute->getIsRequired() == true){?>required-entry<?php } ?>"> <?php $options = $attribute->getSource()->getAllOptions(); foreach($options as $option){ ?> <option value='<?php echo $option['value']?>' <?php if($this->getFormData()->getState() == $option['value']){ echo 'selected="selected"';}?>><?php echo $this->__($option['label'])?></option> <?php } ?> </select> </div> </li> All options are getting loaded into table eav_attribute_option just fine (albeit without a sort_order defined), as well as table eav_attribute_option_value. In the adminhtml / customer-manage customers-account information this select is showing up fine (but its delivered automatically by the system). Seems I should be able to set the sort-order on creation of the attributeOptions, or, certainly, define the sort order in the data/select class. But nothing I've tried works. I'd rather not do a front-end hack either... Oh, and how do I set the default value of this select? (Different question, I know, but related). Setting the attributes 'default' = 'washington' seems to do nothing. There seem to be a lot of ways to set up attribute select options like this. Is there a better way that the one I've outlined here? Perhaps I'm messing something up. Cheers
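
    On the sort order: if I'm reading the Magento 1.x EAV setup class correctly, addAttributeOption() also accepts an 'order' array keyed the same way as 'value' and writes it to eav_attribute_option.sort_order, which is exactly what the setPositionOrder('asc') call in the source model orders by. A sketch of the installer loop with explicit positions follows; the *10 step is arbitrary, and sort($state_list) is only needed if plain alphabetical order is what's wanted (worth verifying against your Magento version).

    // In mysql4-install-0.1.0.php, when building the option array:
    $aOption = array();
    $aOption['attribute_id'] = $installer->getAttributeId('customer', 'state');

    // Optional: force alphabetical order before assigning positions.
    sort($state_list);

    for ($iCount = 0; $iCount < sizeof($state_list); $iCount++) {
        // Option label (store 0 = admin scope).
        $aOption['value']['option' . $iCount][0] = $state_list[$iCount];
        // Explicit position, written to eav_attribute_option.sort_order.
        $aOption['order']['option' . $iCount] = $iCount * 10;
    }

    $installer->addAttributeOption($aOption);

    Separately, for a select attribute the default needs to be the option ID (which only exists after the options are inserted), not the label, which is likely why 'default' => 'Washington' appears to do nothing.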

    Read the article
