Search Results

Search found 1043 results on 42 pages for 'sri kumar'.


  • encfs error while decoding the data

    - by migrator
    I installed encfs and have been using it to secure all my personal and office data, and it was working absolutely fine until two hours ago. The setup is like this: I have an encrypted folder called OfficeData inside my Copy folder, which gets synchronized through Copy. When I log in to the system I mount it with the command encfs ~/Copy/OfficeData ~/Documents/OfficeData, and once my work is over I unmount it with fusermount -u ~/Documents/OfficeData. All this data also gets synchronized with my desktop and with my mobile phone (as a backup). Today when I mounted it, the folder got mounted but no directories or files were present in it. I was worried and read man encfs, which led me to run the command encfs -v -f ~/Copy/OfficeData ~/Documents/OfficeData 2> encfs-OfficeData-report.txt. Below is the output in encfs-OfficeData-report.txt (the repeated getdir/"error decoding filename" blocks appear many more times in the full log):
    The directory "/home/sri/Documents/OfficeData" does not exist. Should it be created? (y,n)
    13:16:26 (main.cpp:523) Root directory: /home/sri/Copy/OfficeData/
    13:16:26 (main.cpp:524) Fuse arguments: (fg) (threaded) (keyCheck) encfs /home/sri/Documents/OfficeData -f -s -o use_ino -o default_permissions
    13:16:26 (FileUtils.cpp:177) version = 20
    13:16:26 (FileUtils.cpp:181) found new serialization format
    13:16:26 (FileUtils.cpp:199) subVersion = 20100713
    13:16:26 (Interface.cpp:165) checking if ssl/aes(3:0:2) implements ssl/aes(3:0:0)
    13:16:26 (SSL_Cipher.cpp:370) allocated cipher ssl/aes, keySize 32, ivlength 16
    13:16:26 (FileUtils.cpp:1620) useStdin: 0
    13:16:46 (Interface.cpp:165) checking if ssl/aes(3:0:2) implements ssl/aes(3:0:0)
    13:16:46 (SSL_Cipher.cpp:370) allocated cipher ssl/aes, keySize 32, ivlength 16
    13:16:49 (FileUtils.cpp:1628) cipher key size = 52
    13:16:49 (Interface.cpp:165) checking if nameio/block(3:0:1) implements nameio/block(3:0:0)
    13:16:49 (MACFileIO.cpp:75) fs block size = 1024, macBytes = 8, randBytes = 0
    13:16:49 (FileNode.cpp:127) calling setIV on (null)
    13:16:49 (DirNode.cpp:770) created FileNode for /home/sri/Copy/OfficeData/
    13:16:49 (encfs.cpp:134) getattr /home/sri/Copy/OfficeData/
    13:16:49 (RawFileIO.cpp:191) getAttr error on /home/sri/Copy/OfficeData/UWbT-M-UKk1JpvNfN5uvOhGn: No such file or directory
    13:16:49 (CipherFileIO.cpp:105) in setIV, current IV = 0, new IV = 4188221457101129840, fileIV = 0
    13:16:49 (DirNode.cpp:770) created FileNode for /home/sri/Copy/OfficeData/UWbT-M-UKk1JpvNfN5uvOhGn
    13:16:49 (encfs.cpp:134) getattr /home/sri/Copy/OfficeData/UWbT-M-UKk1JpvNfN5uvOhGn
    13:16:49 (RawFileIO.cpp:191) getAttr error on /home/sri/Copy/OfficeData/UWbT-M-UKk1JpvNfN5uvOhGn: No such file or directory
    13:16:49 (encfs.cpp:138) getattr error: No such file or directory
    (the same getAttr error sequence repeats for o94olxB3orqarqyFviHKZ,ZF, tVglci2rgp9o8qE-m9AvX6JNj1lQs-ER0OvnxfOb30Z,3, and r1KIEqVkz-,7-6CobavHCSNn)
    13:16:49 (encfs.cpp:213) getdir on /home/sri/Copy/OfficeData/
    13:16:49 (BlockNameIO.cpp:185) padding, _bx, finalSize = 208, 16, -192
    13:16:49 (DirNode.cpp:132) error decoding filename: eWJrLh2dRFAY-7Brbsc,mTqf
    13:16:49 (DirNode.cpp:132) error decoding filename: .encfs6.xml
    13:16:49 (BlockNameIO.cpp:185) padding, _bx, finalSize = 218, 16, -202
    13:16:49 (DirNode.cpp:132) error decoding filename: pvph9DkZ0BMPg2vN4UcfwuNU
    13:19:31 (encfs.cpp:685) doing statfs of /home/sri/Copy/OfficeData
    13:19:32 (RawFileIO.cpp:191) getAttr error on /home/sri/Copy/OfficeData/LuT8R,DlpRnNH9b,fjWiKHKc: No such file or directory
    13:19:32 (CipherFileIO.cpp:105) in setIV, current IV = 0, new IV = 13735228085838055696, fileIV = 0
    13:19:32 (encfs.cpp:138) getattr error: No such file or directory
    13:19:32 (RawFileIO.cpp:191) getAttr error on /home/sri/Copy/OfficeData/UWbT-M-UKk1JpvNfN5uvOhGn: No such file or directory
    13:19:32 (RawFileIO.cpp:191) getAttr error on /home/sri/Copy/OfficeData/o94olxB3orqarqyFviHKZ,ZF: No such file or directory
    13:19:32 (encfs.cpp:213) getdir on /home/sri/Copy/OfficeData/
    13:19:32 (DirNode.cpp:132) error decoding filename: eWJrLh2dRFAY-7Brbsc,mTqf
    13:19:32 (DirNode.cpp:132) error decoding filename: .encfs6.xml
    13:19:32 (DirNode.cpp:132) error decoding filename: pvph9DkZ0BMPg2vN4UcfwuNU
    13:24:10 (openssl.cpp:48) Allocating 41 locks for OpenSSL
    Please help me. Thanks in advance.

    Read the article

  • SQLAuthority News – A Conversation with an Old Friend – Sri Sridharan

    - by pinaldave
    Sri Sridharan is an old friend of mine and we often talk on GTalk. The subjects vary from life in India/USA, movies, music, and of course SQL. We have our differences when we talk about food or movies, but we always agree when we talk about SQL. Yesterday while chatting with him we talked about SQLPASS and the conversation lasted for a long time. Here is the conversation between us on GTalk. I have removed a few of the personal exchanges and formatted the rest into paragraphs, as GTalk often loses the formatting.
    Pinal: Sri, congrats on running for the PASS BoD again. You were so close last year. What made you decide to run again this year?
    Sri: Thank you, Pinal, for your leadership in the PASS India Community and all the things you do out there. After coming so close last year, there was no doubt in my mind that I would run again. I was truly humbled by the support I got from the community. Growing up in India for over 25 years, you are brought up in a very competitive part of the world. Right from the pressure of staying at the top of the class from kindergarten to your graduation, to the relentless push from your parents about studying and getting good grades (and nothing else matters), you essentially end up living in a pressure cooker. To survive that relentless pressure, you need to have a thick skin, the ability to stand up for who you really are and what you want to accomplish, and in the process stay true to those values. I am striving for a greater cause: to make PASS an organization that can help people with their SQL Server careers, to make PASS relevant to its chapter members, to make PASS an organization that every SQL professional in the world wants to be connected with. Just because I did not get elected or appointed last year does not mean that these causes are not worth fighting for. Giving up upon failing the first time is simply not in me. If I did that, what message would I send to those who voted for me? What message would I send to my kids?
    Pinal: As someone who has such strong roots in India, what can the Indian PASS Community expect from you?
    Sri: First of all, I think fostering regional leadership is something PASS must encourage as part of its global growth plan. For PASS global, being able to understand all the issues in a region of the world and make sound decisions will be a tough thing to do on a continuous basis. I expect people like you, chapter leaders, regional mentors, and MVPs of the region to start playing a bigger role in shaping the next generation of PASS. That is something I said in my campaign and I still stand by it. I would like to see growth in the number of chapters in India. The current count does not truly represent the full potential of that region. I was pretty thrilled to see the Bangalore SQLSaturday happen early this year. I would like to see more SQLSaturday events, at least in the major metro cities. I know the issues in India are very different from the rest of the world, so the formula needs to be tweaked a little for it to work better in India. Once the SQLSaturday model is vetted out, maybe there could be enough justification to have SQLRally India. PASS needs to have a premier SQL event in that region. Going to the USA, or Europe for that matter, is incredibly hard due to visa issues etc. So this could be a case where PASS comes closer to where the community is.
    Pinal: What portfolio would you take on if you are elected to the PASS Board?
    Sri: There are some very strong folks on the PASS Board today. The President discusses the portfolios with the group and makes the final call on the portfolios. I am also a fan of having a team associated with the portfolios. In that case, one person is the primary for a portfolio but secondary on a couple of other portfolios. This way people on the board have a direct vested interest in a few portfolios. Personally, I know I would do these portfolios good justice – Chapters, Global Growth, and Events (SQLSat, SQLRally). I would try to see if we can get a director to focus on Volunteers. To me that is very critical for growth in the international regions.
    Pinal: This is an interesting conversation with you, Sri. I have known you a long time, but this is indeed inspiring to many. India is a big country and we appreciate your thoughts.
    Sri: Thank you very much for taking the time to chat with me today. Cheers.
    There are pretty strong candidates for the SQLPASS Board elections this year. I know all of them in person and honestly it is going to be extremely difficult not to vote for any of them. I am indeed in a crunch right now over how to pick one over another. Though the choice is difficult, I encourage you to vote for them. I am extremely confident that the new board of directors will all have the same goal – Better SQL Server Community. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: Database, DBA, PostADay, SQL, SQL Authority, SQL PASS, SQL Query, SQL Server, SQL Tips and Tricks, SQLAuthority News, T SQL, Technology

    Read the article

  • 2011 PASS Board Applicants: Sri Sridharan

    - by andyleonard
    Introduction: I am interviewing 2011 PASS Board Nominee Applicants. As listed on the PASS Board Elections site, the applicants are: Rob Farley, Geoff Hiten, Adam Jorgensen, Denise McInerney, Sri Sridharan, and Kendal Van Dyke. I'm asking everyone the same questions and blogging the responses in the order received. Sri Sridharan is next up. Interview With Sri Sridharan 1. What's your day job? I work for VHA as a Data Architect. I am responsible for 3 main goals: Responsible for Data Governance initiatives in...(read more)

    Read the article

  • SQLAuthority News – Reliving TechEd with Vinod Kumar at Bangalore User Groups

    - by pinaldave
    TechEd India 2012 was held in Bangalore on March 21-23, 2012. Just like every year, the event was bigger, grander and inspiring. Here is my blog post reviewing the event: SQLAuthority News – #TechEdIn – TechEd India 2012 Memories and Photos. For me this is a family event – I get to meet friends who are as dear to me as family. I like to call User Groups family too. A family shares life's personal happiness and experiences; in the same way a User Group shares professional experiences, and quite often UG members become just like family members. When I learned that the following user groups were together building a unique event, I was pretty excited to learn who was going to be the speaker for the event. BDotNet.in – Bangalore .NET Usergroup BITPro.in – Bangalore ITPro Usergroup It was indeed a joy when I learned that the presenter would be Vinod Kumar, who is an integral part of the user groups and a hardcore SQL Server enthusiast. Vinod Kumar is going to present the following two sessions, both focused on the internals of Windows and SQL Server. Understanding Windows with SysInternals Tools – This session will cover various tools, from usage of memory, the x86 architecture, x64, WOW mode, page faults, virtual memory mapping and OOM scenarios to the Perf tool, the PAL tool, Logman and more. Peeling the Onion: SQL Server Internals Demystified – This session will cover advanced disk formats, SQL Server 2012 security changes, memory changes, indirect checkpoint and more. I am very excited, as this time I will get the opportunity to sit in the front rows (I will be reaching there early to get the best possible position) and learn. I am looking forward to the event and I hope you will join us as well. Event Details: Date: Saturday, April 7, 2012 (10:30am until 1:30pm) Venue: Microsoft, Domlur, Bangalore. Event Details: https://www.facebook.com/events/139444029517882/ This session is FREE for all, and anybody can walk in. Community Blog Posts Here are a few of the blog posts written by the community on this subject. Vinod Kumar on Reliving #TechEdIn at Blr UG Manas Dash on Reliving TechEd India 2012 with Vinod Kumar Sudeepta Ganguly on SysInternals n SQLInternals with Vinod Kumar Lohith Re Live TechEd India 2012 with Vinod Kumar Reference: Pinal Dave (http://blog.sqlauthority.com) http://www.youtube.com/watch?v=oRw-p4mahLU Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, SQLAuthority Author Visit, T SQL, Technology, Video

    Read the article

  • SQL Server – Learning SQL Server Performance: Indexing Basics – Interview of Vinod Kumar by Pinal Dave

    - by pinaldave
    Recently I wrote a blog post about Learning SQL Server Performance: Indexing Basics, and I received lots of requests to share some insight into the course. Every single time Performance is discussed, Indexes are mentioned along with it. In recent times, data and application complexity is continuously growing. The demand for faster query response, performance, and scalability by organizations is increasing, and developers and DBAs now need to write efficient code to achieve this. When we developed the course, we made sure that it remains practical and demo-heavy instead of just theories on the subject. Vinod Kumar and I often thought about this and realized that a practical understanding of indexes is very important. One cannot master every single aspect of indexes. However, there is some minimum expertise one should gain if performance is a concern. Here is a 200-second interview of Vinod Kumar I took right after completing the course. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Index, SQL Performance, SQL Query, SQL Server, SQL Tips and Tricks, SQLServer, T SQL, Technology, Video

    Read the article

  • SQL SERVER – Concurrency Basics – Guest Post by Vinod Kumar

    - by pinaldave
    This guest post is by Vinod Kumar. Vinod Kumar has worked with SQL Server extensively since joining the industry over a decade ago. Having worked on various versions from SQL Server 7.0, Oracle 7.3 and other database technologies, he now works with the Microsoft Technology Center (MTC) as a Technology Architect. Let us read the blog post in Vinod's own voice.
    Learning is always fun when it comes to SQL Server, and learning the basics again can be even more fun. I did write about transaction logs and recovery on my blogs, and the concept of simplifying the basics is a challenge. In the real world we always see checks and queues for a process – railway reservations, banks, customer support, and so on – there is a system of lines and queues to facilitate everyone. The shorter the queue, the higher the efficiency of the system (that is, the higher the concurrency). Every database implements this using checks like locking and blocking mechanisms, and implements the standards in a way that facilitates higher concurrency. In this post, let us talk about the topic of concurrency and the various aspects one needs to know about concurrency inside SQL Server. Let us learn the concepts as one-liners:
    Concurrency can be defined as the ability of multiple processes to access or change shared data at the same time. The greater the number of concurrent user processes that can be active without interfering with each other, the greater the concurrency of the database system. Concurrency is reduced when a process that is changing data prevents other processes from reading that data, or when a process that is reading data prevents other processes from changing that data. Concurrency is also affected when multiple processes are attempting to change the same data simultaneously. There are two approaches to managing concurrent data access: the Optimistic Concurrency Model and the Pessimistic Concurrency Model.
    Concurrency Models
    Pessimistic Concurrency: The default behavior is to acquire locks to block access to data that another process is using. It assumes that enough data modification operations are in the system that any given read operation is likely affected by a data modification made by another user (that is, it assumes conflicts will occur). It avoids conflicts by acquiring a lock on data being read so no other processes can modify that data. It also acquires locks on data being modified so no other processes can access the data for either reading or modifying. Readers block writers; writers block readers and writers.
    Optimistic Concurrency: Assumes that there are sufficiently few conflicting data modification operations in the system that any single transaction is unlikely to modify data that another transaction is modifying. The default behavior of optimistic concurrency is to use row versioning to allow data readers to see the state of the data before the modification occurs. Older versions of the data are saved, so a process reading data can see the data as it was when the process started reading, unaffected by any changes being made to that data. Processes modifying the data are unaffected by processes reading the data, because the reader is accessing a saved version of the data rows. Readers do not block writers and writers do not block readers, but writers can and will block writers.
    Transaction Processing
    A transaction is the basic unit of work in SQL Server. A transaction consists of SQL commands that read and update the database, but the update is not considered final until a COMMIT command is issued (at least for an explicit transaction: the start is marked with a BEGIN TRAN and the end is marked by a COMMIT TRAN or ROLLBACK TRAN). Transactions must exhibit all the ACID properties of a transaction.
    ACID Properties
    Transaction processing must guarantee the consistency and recoverability of SQL Server databases. It ensures all transactions are performed as a single unit of work regardless of hardware or system failure. A – Atomicity, C – Consistency, I – Isolation, D – Durability.
    Atomicity: Each transaction is treated as all or nothing – it either commits or aborts.
    Consistency: Ensures that a transaction won't allow the system to arrive at an incorrect logical state – the data must always be logically correct. Consistency is honored even in the event of a system failure.
    Isolation: Separates concurrent transactions from the updates of other incomplete transactions. SQL Server accomplishes isolation among transactions by locking data or creating row versions.
    Durability: After a transaction commits, the durability property ensures that the effects of the transaction persist even if a system failure occurs. If a system failure occurs while a transaction is in progress, the transaction is completely undone, leaving no partial effects on data.
    Transaction Dependencies
    In addition to supporting all four ACID properties, a transaction might exhibit a few other behaviors (known as dependency problems or consistency problems).
    Lost Updates: Occur when two processes read the same data and both manipulate the data, changing its value, and then both try to update the original data to the new value. The second process might overwrite the first update completely.
    Dirty Reads: Occur when a process reads uncommitted data. If one process has changed data but not yet committed the change, another process reading the data will read it in an inconsistent state.
    Non-repeatable Reads: A read is non-repeatable if a process might get different values when reading the same data in two reads within the same transaction. This can happen when another process changes the data in between the reads that the first process is doing.
    Phantoms: Occur when membership in a set changes. A phantom occurs if two SELECT operations using the same predicate in the same transaction return a different number of rows.
    Isolation Levels
    SQL Server supports 5 isolation levels that control the behavior of read operations.
    Read Uncommitted: All behaviors except for lost updates are possible. It is implemented by allowing read operations to not take any locks, and because of this, a read won't be blocked by conflicting locks acquired by other processes. The process can read data that another process has modified but not yet committed. When using the read uncommitted isolation level and scanning an entire table, SQL Server can decide to do an allocation order scan (in page-number order) instead of a logical order scan (following page pointers). If another process doing concurrent operations changes data and moves rows to a new location in the table, the allocation order scan can end up reading the same row twice. This can also happen if you have read a row before it is updated and an update then moves the row to a higher page number than your scan encounters later. Performing an allocation order scan under read uncommitted can also cause you to miss a row completely – this can happen when a row on a high page number that hasn't been read yet is updated and moved to a lower page number that has already been read.
    Read Committed: There are two varieties of read committed isolation: optimistic and pessimistic (the default). It ensures that a read never reads data that another application hasn't committed. If another transaction is updating data and has exclusive locks on the data, your transaction will have to wait for the locks to be released. Your transaction must put share locks on data that is visited, which means that data might be unavailable for others to use. A share lock doesn't prevent others from reading, but prevents them from updating. Read committed (snapshot) ensures that an operation never reads uncommitted data, but not by forcing other processes to wait. SQL Server generates a version of the changed row with its previous committed values. The data being changed is still locked, but other processes can see the previous versions of the data as it was before the update operation began.
    Repeatable Read: This is a pessimistic isolation level. It ensures that if a transaction revisits data or a query is reissued, the data doesn't change. That is, issuing the same query twice within a transaction cannot pick up any changes to data values made by another user's transaction, because no changes can be made by other transactions. However, this does allow phantom rows to appear. Preventing non-repeatable reads is a desirable safeguard, but the cost is that all shared locks in a transaction must be held until the completion of the transaction.
    Snapshot: Snapshot Isolation (SI) is an optimistic isolation level. It allows processes to read older versions of committed data if the current version is locked. The difference between snapshot and read committed has to do with how old the older versions have to be. It is possible to have two transactions executing simultaneously that give us a result that is not possible in any serial execution.
    Serializable: This is the strongest of the pessimistic isolation levels. It adds to the repeatable read isolation level by ensuring that if a query is reissued, rows were not added in the interim, i.e., phantoms do not appear. Preventing phantoms is another desirable safeguard, but the cost of this extra safeguard is similar to that of repeatable read – all shared locks in a transaction must be held until the transaction completes. In addition, the serializable isolation level requires that you lock not only data that has been read but also data that doesn't exist. For example, if a SELECT returned no rows, you want it to return no rows when the query is reissued. This is implemented in SQL Server by a special kind of lock called the key-range lock. Key-range locks require that there be an index on the column that defines the range of values. If there is no index on the column, serializable isolation requires a table lock. It gets its name from the fact that running multiple serializable transactions at the same time is the equivalent of running them one at a time.
    Now that we understand the basics of what concurrency is, the subsequent blog posts will try to bring out the basics around locking, blocking, and deadlocks, because they are the fundamental blocks that make concurrency possible. Now if you are with me – let us continue learning with SQL Server Locking Basics.
Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Performance, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology Tagged: Concurrency
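    To make the isolation levels above a little more concrete, here is a minimal C# sketch (not from the original post) showing how an application can opt into one of them through ADO.NET. The connection string, database and table names are placeholders, and snapshot isolation additionally requires ALLOW_SNAPSHOT_ISOLATION to be enabled on the database.
        using System;
        using System.Data;
        using System.Data.SqlClient;

        class IsolationLevelDemo
        {
            static void Main()
            {
                // Placeholder connection string and table name.
                const string connectionString =
                    "Server=.;Database=SampleDb;Integrated Security=true";

                using (var connection = new SqlConnection(connectionString))
                {
                    connection.Open();

                    // Pessimistic default is ReadCommitted (share locks while reading).
                    // Snapshot is the optimistic alternative: readers see row versions
                    // instead of blocking on writers' exclusive locks.
                    using (var transaction = connection.BeginTransaction(IsolationLevel.Snapshot))
                    using (var command = new SqlCommand(
                        "SELECT COUNT(*) FROM dbo.Orders;", connection, transaction))
                    {
                        var visibleRows = (int)command.ExecuteScalar();
                        Console.WriteLine($"Rows visible under snapshot isolation: {visibleRows}");
                        transaction.Commit();   // end the (read-only) transaction cleanly
                    }
                }
            }
        }
    Swapping IsolationLevel.Snapshot for ReadUncommitted, ReadCommitted, RepeatableRead or Serializable exercises the other levels described above, trading concurrency for the stronger guarantees each one provides.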

    Read the article

  • Rapidly Deploy Oracle Applications with Oracle VM Templates

    - by monica.kumar
    Oracle today announced Oracle VM Templates for a number of Oracle Applications, including Oracle E-Business Suite 12.1, Oracle's JD Edwards Enterprise One 9.0, and Oracle's PeopleSoft 9.1. These Oracle VM Templates, based on Oracle Enterprise Linux, provide pre-installed and pre-configured enterprise software images that help eliminate the need to install new software from scratch, offering customers a time-saving approach to deploying a fully configured software stack. Learn more about Oracle VM Templates.

    Read the article

  • Linux.com: Q&A on Oracle's Unbreakable Enterprise Kernel

    - by monica.kumar
    Linux.com recently published a Q&A on Oracle's Unbreakable Enterprise Kernel for Linux. The interview highlights the key benefits of Oracle's new offering and also offers an insight into our long and ongoing commitment to advancing Linux. Here are some excerpts from the Q&A: All enhancements made in the Unbreakable Enterprise Kernel are open source and have been made available to the Linux community. Oracle Linux, including both the kernels, is free to download, use and distribute. You can download the Unbreakable Enterprise Kernel at http://public-yum.oracle.com Source code is available, including a public git repository with full changelog and individual patches and checkins for convenience. Read the entire interview. Visit the Oracle Linux Homepage.

    Read the article

  • New Reference Configuration: Accelerate Deployment of Virtual Infrastructure

    - by monica.kumar
    Today, Oracle announced the availability of Oracle VM blade cluster reference configuration based on Sun servers, storage and Oracle VM software. Assembling and integrating software and hardware systems from different vendors can be a huge barrier to deploying virtualized infrastructures as it is often a complicated, time-consuming, risky and expensive process. Using this tested configuration can help reduce the time to configure and deploy a virtual infrastructure by up to 98% as compared to putting together multi-vendor configurations. Once ready, the infrastructure can be used to easily deploy enterprise applications in a matter of minutes to hours as opposed to days/weeks, by using Oracle VM Templates. Find out more: Press Release Business whitepaper Technical whitepaper

    Read the article

  • how to group data in a list

    - by prince23
    I need to group the data of a list in C#. I have a class called Information.cs with the properties Name, School, Parent, and a list of that class. Example data (name, school, parent):
    kumar fes All
    manju fes kumar
    anu frank kumar
    anitha jss All
    rohit frank manju
    anill vijaya manju
    vani jss kumar
    soumya jss kumar
    madhu jss rohit
    shiva jss rohit
    vanitha jss anitha
    anu jss anitha
    Now, taking this as input, I want the output formatted as hierarchical data. When the parent is "All", it is the topmost level, e.g. kumar fes All. What I need to do here is create an object[0] for it and then check in the list whether kumar exists as a parent; if it exists, then add those items under object[0] as its children. So I need to create further objects under it for manju fes kumar and anu frank kumar. What I want to do is iterate through the list and then check the parent level based on name, school, parent:
    kumar fes All --> obj[0]
    manju fes kumar --> obj1[0]
    anu frank kumar --> obj1[1]
    For obj1, obj[0] will be the parent. Like this I need to generate a list or observation class.
    anitha jss All --> obj[1]
    vanitha jss anitha --> obj1[0]
    anu jss vanitha --> obj2[0]
    Here obj2[0] -- obj1[0] -- obj[1] will be the parent chain. Like this I need to create a list or an observation class. I hope my question is clear about what I am trying to ask. I want to know how I can create such an observation class. Any help would be really great.
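    One way to approach this (a minimal, hedged C# sketch rather than a definitive answer): keep the flat Information records exactly as described in the question, and build a tree of nodes whose roots are the records with Parent == "All". The InformationNode type and HierarchyBuilder helper are hypothetical names introduced for illustration; only the Name/School/Parent properties come from the question.
        using System;
        using System.Collections.Generic;
        using System.Linq;

        // Flat record exactly as described in the question.
        public class Information
        {
            public string Name { get; set; }
            public string School { get; set; }
            public string Parent { get; set; }
        }

        // Hypothetical hierarchical node: one record plus its children.
        public class InformationNode
        {
            public Information Item { get; set; }
            public List<InformationNode> Children { get; } = new List<InformationNode>();
        }

        public static class HierarchyBuilder
        {
            // Turns the flat list into root nodes (Parent == "All") with nested children.
            public static List<InformationNode> Build(IReadOnlyList<Information> items)
            {
                // One node per record (names can repeat, e.g. "anu" in the sample data).
                var nodes = items.Select(i => new InformationNode { Item = i }).ToList();

                // Index potential parents by name; if a name repeats, attach to the first match.
                var byName = nodes
                    .GroupBy(n => n.Item.Name, StringComparer.OrdinalIgnoreCase)
                    .ToDictionary(g => g.Key, g => g.First(), StringComparer.OrdinalIgnoreCase);

                var roots = new List<InformationNode>();
                foreach (var node in nodes)
                {
                    if (string.Equals(node.Item.Parent, "All", StringComparison.OrdinalIgnoreCase))
                        roots.Add(node);                                   // topmost level
                    else if (byName.TryGetValue(node.Item.Parent, out var parent)
                             && !ReferenceEquals(parent, node))
                        parent.Children.Add(node);                         // nested under its parent
                    else
                        roots.Add(node);                                   // parent not found in the list
                }
                return roots;
            }
        }
    Walking the returned roots recursively then prints or binds the hierarchy; if the result needs to notify a UI of changes, Children could be an ObservableCollection<InformationNode> instead of a List.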

    Read the article

  • how to group data in a list c#

    - by prince23
    Hi, I need to group the data of a list in C#. I have a class called Information.cs with the properties Name, School, Parent, and a list of that class. Example data (name, school, parent):
    kumar fes All
    manju fes kumar
    anu frank kumar
    anitha jss All
    rohit frank manju
    anill vijaya manju
    vani jss kumar
    soumya jss kumar
    madhu jss rohit
    shiva jss rohit
    vanitha jss anitha
    anu jss anitha
    Now, taking this as input, I want the output formatted as hierarchical data. When the parent is "All", it is the topmost level, e.g. kumar fes All. What I need to do here is create an object[0] for it and then check in the list whether kumar exists as a parent; if it exists, then add those items under object[0] as its children. So I need to create further objects under it for manju fes kumar and anu frank kumar. What I want to do is iterate through the list and then check the parent level based on name, school, parent:
    kumar fes All --> obj[0]
    manju fes kumar --> obj1[0]
    anu frank kumar --> obj1[1]
    For obj1, obj[0] will be the parent. Like this I need to generate a list or observation class.
    anitha jss All --> obj[1]
    vanitha jss anitha --> obj1[0]
    anu jss vanitha --> obj2[0]
    Here obj2[0] -- obj1[0] -- obj[1] will be the parent chain. Like this I need to create a list or an observation class. I hope my question is clear about what I am trying to ask. I want to know how I can create such an observation class. Any help would be really great. Thanks, prince.

    Read the article

  • Big Data – Is Big Data Relevant to me? – Big Data Questionnaires – Guest Post by Vinod Kumar

    - by Pinal Dave
    This guest post is by Vinod Kumar. Vinod Kumar has worked with SQL Server extensively since joining the industry over a decade ago. Having worked on various versions from SQL Server 7.0, Oracle 7.3 and other database technologies, he now works with the Microsoft Technology Center (MTC) as a Technology Architect. Let us read the blog post in Vinod's own voice.
    I think the series from Pinal is a good one for anyone planning to start their Big Data journey from the basics. In my daily customer interactions this buzz of "Big Data" always comes up, and I generally react by saying: "Sir, do you really have a 'Big Data' problem, or do you have a big Data problem?" Generally, there is a silence in the air when I ask this question. Data is everywhere in organizations – be it big data, small data, or all data, and for a few it is bad data, which is the same as no data :). Wow, don't discount me as someone who opposes "Big Data"; I am as much a big supporter as I am a critic of how people abuse this term. In this post, I want to let my mind flow so that you can also think about these concepts in the direction I see them. In any case, this is not an exhaustive dump of what is in my mind, but you will surely get the drift of how I question Big Data terms with customers!
    Is Big Data Relevant to me? Many of my customers talk to me like a blank whiteboard, with no idea of "why Big Data". They want to jump on the technology bandwagon and decipher insights from their unexplored data, a.k.a. unstructured data combined with structured data. So what are the industry scenarios that come to mind? Here are some of them:
    Financials: Fraud detection – banks and credit cards are monitoring your spending habits on a real-time basis. Customer segmentation – applies in every industry from banking to retail to aviation to utilities and others that deal with end customers who consume their products and services. Customer sentiment analysis – responding to negative brand perception on social media or amplifying the positive perception. Sales and marketing campaigns – understand the impact and get closer to customer delight. Call center analysis – attempt to take unstructured voice recordings and analyze them for content and sentiment.
    Medical: Reduce re-admissions – how to build proactive follow-up engagements with patients. Patient monitoring – how to track inpatient, outpatient, emergency visits, intensive care units, etc. Preventive care – disease identification and risk stratification is a very crucial business function for medical. Claims fraud detection – there are no precise dollars that one can put here, but this is a big thing for the medical field.
    Retail: Customer sentiment analysis, customer care centers, campaign management. Supply chain analysis – every sensor and RFID data point can be tracked for warehouse space optimization. Location-based marketing – based on where a check-in happens, retail stores can optimize their marketing.
    Telecom: Price optimization and plans, finding customer churn, customer loyalty programs. Call Detail Record (CDR) analysis, network optimizations, user location analysis. Customer behavior analysis.
    Insurance: Fraud detection and analysis, pricing based on customer sentiment analysis, loyalty management. Agents analysis, customer value management.
    This list can go on to other areas like Utility, Manufacturing, Travel, ITES etc. So as you can see, there are obviously interesting use cases for each of these industry verticals. This is just a representative list. Where to start?
A lot of times I try to quiz customers on a number of dimensions before starting a Big Data conversation. Are you getting the data you need the way you want it and in a timely manner? Can you get in and analyze the data you need? How quickly is IT to respond to your BI Requests? How easily can you get at the data that you need to run your business/department/project? How are you currently measuring your business? Can you get the data you need to react WITHIN THE QUARTER to impact behaviors to meet your numbers or is it always “rear-view mirror?” How are you measuring: The Brand Customer Sentiment Your Competition Your Pricing Your performance Supply Chain Efficiencies Predictive product / service positioning What are your key challenges of driving collaboration across your global business?  What the challenges in innovation? What challenges are you facing in getting more information out of your data? Note: Garbage-in is Garbage-out. Hold good for all reporting / analytics requirements Big Data POCs? A number of customers get into the realm of setting a small team to work on Big Data – well it is a great start from an understanding point of view, but I tend to ask a number of other questions to such customers. Some of these common questions are: To what degree is your advanced analytics (natural language processing, sentiment analysis, predictive analytics and classification) paired with your Big Data’s efforts? Do you have dedicated resources exploring the possibilities of advanced analytics in Big Data for your business line? Do you plan to employ machine learning technology while doing Advanced Analytics? How is Social Media being monitored in your organization? What is your ability to scale in terms of storage and processing power? Do you have a system in place to sort incoming data in near real time by potential value, data quality, and use frequency? Do you use event-driven architecture to manage incoming data? Do you have specialized data services that can accommodate different formats, security, and the management requirements of multiple data sources? Is your organization currently using or considering in-memory analytics? To what degree are you able to correlate data from your Big Data infrastructure with that from your enterprise data warehouse? Have you extended the role of Data Stewards to include ownership of big data components? Do you prioritize data quality based on the source system (that is Facebook/Twitter data has lower quality thresholds than radio frequency identification (RFID) for a tracking system)? Do your retention policies consider the different legal responsibilities for storing Big Data for a specific amount of time? Do Data Scientists work in close collaboration with Data Stewards to ensure data quality? How is access to attributes of Big Data being given out in the organization? Are roles related to Big Data (Advanced Analyst, Data Scientist) clearly defined? How involved is risk management in the Big Data governance process? Is there a set of documented policies regarding Big Data governance? Is there an enforcement mechanism or approach to ensure that policies are followed? Who is the key sponsor for your Big Data governance program? (The CIO is best) Do you have defined policies surrounding the use of social media data for potential employees and customers, as well as the use of customer Geo-location data? How accessible are complex analytic routines to your user base? 
What is the level of involvement of outside vendors and third parties in the planning and execution of Big Data projects? What programming technologies are used by your data warehouse/BI staff when working with Big Data? These are some of the important questions I ask each customer who is actively evaluating Big Data trends for their organization. They give you a sense of direction on where to start, what to use, how to secure, how to analyze, and more.
    Sign off: Any Big Data analysis is incomplete without a compelling story. The best way to understand this is to watch the Hans Rosling – Gapminder videos (2:17 to 6:06) about third-world myths. Don't get overwhelmed by the Big Data buzzword; the destination – what your data actually tells you – is what matters. In this blog post we did not look at any particular Big Data technologies. This is the set of questions one needs to keep in mind while embarking on a Big Data journey. I wrote about some of the basics on my blog: Big Data – Big Hype yet Big Opportunity. Do let me know if these questions make sense. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: Big Data, PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL

    Read the article

  • SQL SERVER – Transcript of Learning SQL Server Performance: Indexing Basics – Interview of Vinod Kumar by Pinal Dave

    - by pinaldave
    Recently I wrote a blog post about Learning SQL Server Performance: Indexing Basics, and I received lots of requests to share some insight into the course. Here is a 200-second interview with Vinod Kumar that I took right after completing the course. We have a few free codes to watch the course; please leave your comment at http://facebook.com/SQLAuth and we will send the code to a few of the first ones. Many people said they would like to read the transcript of the video, so I have generated it here.
    Pinal: Vinod, we recently released this course, SQL Server Indexing. It is about performance tuning. So tell me – how do indexes help performance?
    Vinod: I think what happens in the industry when it comes to performance is that developers and DBAs look at indexes first. So that's the first step for any performance tuning exercise; indexing is one of the most critical aspects and it is important to learn it the right way.
    Pinal: Correct. So what you mean to say is that if you know indexing you can pretty much tune any server and query.
    Vinod: So I might contradict my earlier statement now. Indexing is usually a stepping stone, but it does not lead you to the end. It's good to start with indexing, and there are lots of nuances to indexing that you need to understand, like how SQL Server uses indexes and how performance can improve because of the strategies that you have made.
    Pinal: But now I'm confused. First you said indexes are good, and then you said that indexes can degrade your performance. So what is this course about? I mean, how does this course really make an impact?
    Vinod: Ok – so from the course perspective, what we are trying to do is give you a capsule which gives you a good start. Every journey needs a beginning; you need that first step. This course is that first step in understanding. This is the most basic, fundamental course that we have tried to attack. These are the fundamentals of indexing, some of the key things that you must know about indexing. Some of the basics of indexing are lesser known, and so I think this course is geared towards each and every one of you out there who wants to understand a little bit more about indexing.
    Pinal: So what I understand is that if I enroll in this course I will have a minimum understanding about indexing when dealing with performance tuning. Right?
    Vinod: Exactly. In this course we have tried to give you a nice summary. We are talking about clustered indexes, nonclustered indexes, too many indexes, too few indexes, over-indexing, under-indexing, duplicate indexes, and columnstore indexes with SQL Server 2012. There's lots to learn.
    Pinal: You can see the URL [http://bit.ly/sql-index] of the course on the screen. Go ahead, attend, and let us know what you think about it. Thank you.
    Vinod: Thank you. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Index, SQL Performance, SQL Query, SQL Server, SQL Tips and Tricks, SQLServer, T SQL, Technology, Video
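    The interview keeps things conceptual, so here is a minimal T-SQL sketch of the clustered vs. nonclustered distinction being discussed. The table and index names are hypothetical examples, not part of the course material.
        -- A hypothetical table used only for illustration.
        CREATE TABLE dbo.Orders
        (
            OrderID    INT IDENTITY(1, 1) NOT NULL,
            CustomerID INT            NOT NULL,
            OrderDate  DATE           NOT NULL,
            TotalDue   DECIMAL(12, 2) NOT NULL
        );

        -- Clustered index: defines the physical order of the rows; only one per table.
        CREATE UNIQUE CLUSTERED INDEX CIX_Orders_OrderID
            ON dbo.Orders (OrderID);

        -- Nonclustered index: a separate structure pointing back to the clustered key.
        -- The INCLUDE column lets the query below be answered from the index alone.
        CREATE NONCLUSTERED INDEX IX_Orders_CustomerID_OrderDate
            ON dbo.Orders (CustomerID, OrderDate)
            INCLUDE (TotalDue);

        -- A query this nonclustered index is shaped to serve:
        SELECT OrderDate, TotalDue
        FROM dbo.Orders
        WHERE CustomerID = 42;
    As the interview notes, whether such an index actually helps depends on the workload – "too many indexes" hurt writes just as "too few" hurt reads.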

    Read the article

  • SQL SERVER – #TechEdIn – Presenting Tomorrow on SQL Server Misconception and Resolution with Vinod Kumar at TechEd India 2012

    - by pinaldave
    I am excited AND nervous at the same time. I am going to present a very interesting topic tomorrow in the SQL Server track at TechEd India. This will be my fourth time presenting at TechEd India. So far, I have received so much feedback about this one session; it seems like every single person out there has their own wishes and requests. I am sure it is going to be a very challenging experience to satisfy everyone who attends the session.
    Surprise Element: Here is the good news – I am going to co-present this session with Vinod Kumar, my long-time friend and co-worker. We have known each other for almost four years now, but this is the very first time that we are going to present together on the big stage of TechEd. When there is more than one presenter, the usual trick is to practice the session multiple times and know exactly what the other is going to present and talk about. However, there's a catch – we decided to make it different this time and have shared nothing with each other about what exactly we are going to present. This makes everything extremely interesting, as each of us will be as clueless as the audience when the other person is talking.
    Action Item: Here are a few action items for all of those who are going to attend this session. Vinod and I will be at the venue 15 minutes before the session. Do come in early and talk with us. We would be glad to talk with you and see if either of us can accommodate your suggestion in our session. If we do, we will give you a surprise gift. As discussed, this session is going to be a unique two-presenter session. You will have the chance to take a side with one speaker and stump the other. Come early to decide which speaker you want to cheer for during the session.
    Quiz and Goodies: By now, you must have figured out that this is going to be an extremely interactive session. We need your support through your active participation. We will have some really brain-twisting quizzes lined up just for you. Take part and win surprises from us! Trust me, if you get it right, we will give you something which can help you learn more! We will have a quiz on Twitter as well: we will ask a question in person and you will be able to participate on Twitter.
    10 Demos: As I said, neither of us knows what the other is going to present, but there are a few things which we know very well. We have 10 demos and 6 slides. I think this is going to be an exciting demo marathon. Trust me, you will love it, and the taste of this session will stay with you till the next TechEd.
    Session Details – Title: SQL Server Misconceptions and Resolution – A Practical Perspective (Add to Calendar). Abstract: "The earth is flat!" – an ancient common misconception, which has been proven incorrect as we progressed into modern times. In this session, we will look at various prevailing database misconceptions and their resolutions with the aid of demos. In this unique session, the audience will be a part of the conversation and resolution. Date and Time: March 21, 2012, 15:15 to 16:15. Location: Hotel Lalit Ashok – Kumara Krupa High Grounds, Bengaluru – 560001, Karnataka, India. Add to Calendar. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Interview Questions and Answers, SQL Query, SQL Server, SQL Tips and Tricks, SQLServer, T SQL, Technology

    Read the article

  • Create an Alias Directory inside a Virtual Host

    - by Praveen Kumar
    First, let me say that I asked this question on StackOverflow and thought I could get more replies here. I checked here, here, here, here, here, and here before asking this question. I guess my search skills are weak. I am using WampServer version 2.2e. My need is this: I want a virtual path inside a virtual host. Here are the hosts that I have.
    Primary Virtual Host (Localhost):
        NameVirtualHost *:80
        <VirtualHost *:80>
            ServerName localhost
            DocumentRoot "C:/Wamp/www"
        </VirtualHost>
    My Apps Virtual Host:
        <VirtualHost *:80>
            ServerName apps.ptrl
            DocumentRoot "C:/Wamp/vhosts/ptrl/apps"
            ErrorLog "logs/apps-ptrl-error.log"
            CustomLog "logs/apps-ptrl-access.log" common
            <Directory "C:/Wamp/vhosts/ptrl/apps">
                allow from all
                order allow,deny
                AllowOverride All
            </Directory>
            DirectoryIndex index.html index.htm index.php
        </VirtualHost>
    My Blog Virtual Host:
        <VirtualHost *:80>
            ServerName blog.praveen-kumar.ptrl
            DocumentRoot "C:/Wamp/vhosts/ptrl/praveen-kumar/blog"
            ErrorLog "logs/praveen-kumar-ptrl-error.log"
            CustomLog "logs/praveen-kumar-ptrl-access.log" common
            <Directory "C:/Wamp/vhosts/ptrl/praveen-kumar/blog">
                allow from all
                order allow,deny
                AllowOverride All
            </Directory>
            DirectoryIndex index.html index.htm index.php
        </VirtualHost>
    My requirement now is that http://apps.ptrl/blog/ and http://blog.praveen-kumar.ptrl/ should point to the same directory. One thing I thought of was moving the blog folder inside the apps folder, but it is connected with Git and other stuff is in there, so moving the folder is not possible. So I thought of adding an alias to the VirtualHost this way:
        <VirtualHost *:80>
            ServerName apps.ptrl
            DocumentRoot "C:/Wamp/vhosts/ptrl/apps"
            ErrorLog "logs/apps-ptrl-error.log"
            CustomLog "logs/apps-ptrl-access.log" common
            <Directory "C:/Wamp/vhosts/ptrl/apps">
                allow from all
                order allow,deny
                AllowOverride All
            </Directory>
            DirectoryIndex index.html index.htm index.php
            # The alias to the blog!
            Alias /blog "C:/Wamp/vhosts/ptrl/praveen-kumar/blog"
            <Directory "C:/Wamp/vhosts/ptrl/praveen-kumar/blog">
                allow from all
                order allow,deny
                AllowOverride All
            </Directory>
        </VirtualHost>
    But when I tried to access http://apps.ptrl/blog, I got an Error 403 Forbidden page. Am I doing the right thing? If you need to look at the access log and error log, here they are:
        # Access Log
        127.0.0.1 - - [14/Oct/2012:09:53:11 +0530] "GET /blog HTTP/1.1" 403 206
        127.0.0.1 - - [14/Oct/2012:09:53:11 +0530] "GET /favicon.ico HTTP/1.1" 404 209
        127.0.0.1 - - [14/Oct/2012:09:53:53 +0530] "GET / HTTP/1.1" 200 6935
        127.0.0.1 - - [14/Oct/2012:09:53:53 +0530] "GET /app/blog/thumb.png HTTP/1.1" 404 216
        # Error Log
        [Sun Oct 14 09:53:11 2012] [error] [client 127.0.0.1] client denied by server configuration: C:/Wamp/vhosts/ptrl/praveen-kumar/blog
        [Sun Oct 14 09:53:11 2012] [error] [client 127.0.0.1] File does not exist: C:/Wamp/vhosts/ptrl/apps/favicon.ico
        [Sun Oct 14 09:53:53 2012] [error] [client 127.0.0.1] File does not exist: C:/Wamp/vhosts/ptrl/apps/app/blog, referer: http://apps.ptrl/
    Waiting eagerly for some help. I am ready to provide more info if needed.
    Update #1: Changed VirtualHost:
        <VirtualHost *:80>
            ServerName apps.ptrl
            DocumentRoot "C:/Wamp/vhosts/ptrl/apps"
            ErrorLog "logs/apps-ptrl-error.log"
            CustomLog "logs/apps-ptrl-access.log" common
            # The alias to the blog!
            Alias /blog "C:/Wamp/vhosts/ptrl/praveen-kumar/blog"
            <Directory "C:/Wamp/vhosts/ptrl/praveen-kumar/blog">
                allow from all
                order allow,deny
                AllowOverride All
            </Directory>
            <Directory "C:/Wamp/vhosts/ptrl/apps">
                allow from all
                order allow,deny
                AllowOverride All
            </Directory>
            DirectoryIndex index.html index.htm index.php
        </VirtualHost>
    The issue now: I am able to access the site and the physical links are working, i.e., I can open http://apps.ptrl/blog/index.php but not http://apps.ptrl/blog/view-1.ptf, which gets translated to http://apps.ptrl/blog/index.php?page=view&id=1. Any solutions?

    Read the article

  • SQL SERVER – A Picture is Worth a Thousand Words – A Collection of Inspiring and Funny Posts by Vinod Kumar

    - by pinaldave
    One of the most popular quotes is: a picture is worth a thousand words. Working on this concept, I started a series on my blog called the "Picture Post". Rather than rambling through tons of text, we try to give you a capsule view of the post at a quick glance. Some of the picture posts already available on my blog are:
    Correlation of Ego and Work: Ego and pride most of the time become a hindrance when we work inside a team. Taking this cue, the first ever picture post was published. A simple and easy-to-understand concept. I would say ego is the biggest enemy of humans. Read Original Post.
    Success (Perception Vs Reality): Personally, I have always thought success is not something the talented achieve with the opportunity presented to them; success is developed using the opportunity in hand now. In this fast-paced world where success is pre-defined and convoluted by metrics, it is hard to understand how complex it can sometimes be. So I took a stab at this concept in a simple way. Read Original Post.
    Doing Vs Saying: As Einstein would describe it, insanity is doing the same thing over and over again and expecting different results. Given the amount of information we get, it is difficult to keep track, learn and implement it all. If you think back to your college days, there were always 5-6 people doing different things, and we naturally tried to emulate what they were doing. This could be anything from competitive exams – GMAT, GRE, CAT – to higher-ed and B-school hunting. Rather than saying you are going to do it, it is best to do and then say!!! Read Original Picture Post.
    Your View Vs Management View: Being in the corporate world can be really demanding, and we keep asking this question – "Why me?" – when the performance appraisal process ends. In this post I just want to ask for one frank opinion: "Are you really self-critical in your assessments?" If that is the case, there shouldn't be any heartburn or surprises. If you had just one thing to take back: forget what others are getting and invest time in making yourself better, because that is going to take you longer and further in your career. Read Picture Post.
    Blogging lifecycle for majority: I am happy and fortunate to be in this blog post, because this picture post surely doesn't apply to SQLAuthority, where consistency and persistence have been the hallmark of the blog. For the majority of others – who tend to start a blog, go into slumber for a while, and then write saying they want to get back to blogging – this picture post was done specifically for them.
    Paradox of being someone else: We always dream of becoming somebody, and in the process of doing so, we become nobody. In this constant tussle of lost identity we forget to enjoy the moment in front of us. I depicted this using a simple analogy of our constant struggle to get to the other side, only to realize we missed the wonderful moments. Grass is not greener on the other side; grass is greener where we water it. Read Picture Post.
    And on the lighter side… Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: About Me, PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology

    Read the article

  • issue in ObservableCollection

    - by prince23
    Hi, I have a list with this data. I have a class called Information.cs with the properties Name, School, and Parent. Example data (Name / School / Parent):
        kumar    fes     All
        manju    fes     kumar
        anu      frank   kumar
        anitha   jss     All
        rohit    frank   manju
        anill    vijaya  manju
        vani     jss     kumar
        soumya   jss     kumar
        madhu    jss     rohit
        shiva    jss     rohit
        vanitha  jss     anitha
        anu      jss     anitha
    Taking this as input, I want the output formatted as hierarchical data. When Parent is "All", the item is at the topmost level, e.g. "kumar fes All". What I need to do here is create object[0], then check in the list whether kumar exists as a parent; if it exists, add those items under object[0]. I then need to create one more object under "manju fes kumar" and "anu frank kumar", and so on. The class below shows how the data is formatted:
        using System;
        using System.Collections.ObjectModel;

        public class SampleProjectData
        {
            public static ObservableCollection<Product> GetSampleData()
            {
                DateTime dtS = DateTime.Now;
                ObservableCollection<Product> teams = new ObservableCollection<Product>();
                teams.Add(new Product() { PDName = "Product1", OverallStartTime = dtS, OverallEndTime = dtS + TimeSpan.FromDays(3) });
                Project emp = new Project() { PName = "Project1", OverallStartTime = dtS + TimeSpan.FromDays(1), OverallEndTime = dtS + TimeSpan.FromDays(6) };
                emp.Tasks.Add(new Task() { StartTime = dtS, EndTime = dtS + TimeSpan.FromDays(2), TaskName = "John's Task 3" });
                emp.Tasks.Add(new Task() { StartTime = dtS + TimeSpan.FromDays(3), EndTime = dtS + TimeSpan.FromDays(4), TaskName = "John's Task 2" });
                teams[0].Projects.Add(emp);
                return teams;
            }
        }

    Read the article

  • SQLAuthority News – A Successful Community TechDays at Ahmedabad – December 11, 2010

    - by pinaldave
    We recently had one of the best community events in Ahmedabad. We were fortunate to have SQL experts from around the world presenting at this event. This gathering was very special because, besides Jacob Sebastian and myself, we had two other speakers traveling all the way from Florida (Rushabh Mehta) and Bangalore (Vinod Kumar). There were a total of nearly 170 attendees and the event was a blast. Here are the details of the event.
    Pinal Dave presenting at Community Tech Days
    On the day of the event it seemed to be the coldest day in Ahmedabad, yet I was glad to see hundreds of people waiting for the doors to open hours in advance. We started the day with hot coffee and cookies – yes, food first, right after my keynote. I could clearly see that the coffee did some magic right away; the hall was almost full after the coffee break.
    Jacob Sebastian presenting at Community Tech Days
    Jacob Sebastian, a SQL Server MVP and a close friend of mine, had the unusual job of surprising everybody with an innovative topic accompanied by lots of question-and-answer portions. That's definitely one thing to love about Jacob: the novelty of his subjects. His presentation, entitled "Best Database Practices for the .Net", really worked magic on the crowd.
    Pinal Dave presenting at Community Tech Days
    After Jacob Sebastian, I presented "Best Database Practices for the SharePoint". It was really fun to present the topic from the perspective of the database itself. The main highlight of my presentation was when I talked about how one can speed up database performance by 40% for SharePoint in just 40 seconds. It was fun because the most important thing was to convince people to apply the recommendation as soon as they walked out of the session. It was really amusing, and the response of the participants was remarkable.
    Pinal Dave presenting at Community Tech Days
    My session was followed by the most-awaited session of the day: that of Rushabh Mehta. He is an international BI expert who traveled all the way from Florida to present the "Self Service BI" session. This session was funny and truly interesting; in fact, no one knew BI could be this entertaining and fascinating. Rushabh has an appealing presentation style and instantly got a lot of interaction from the audience.
    Rushabh Mehta presenting at Community Tech Days
    We had a networking lunch break in between, when we talked about many different topics. It is always interesting to get in touch with the community and feel a part of it. I had a wonderful time during the break.
    Vinod Kumar presenting at Community Tech Days
    The slot right after lunch is apparently the most difficult one for a presenter, as many people start to feel sleepy and drowsy. This spot was requested by Microsoft SQL Server Evangelist Vinod Kumar himself. During our discussion he suggested that if he got this slot, he would make sure people were awake and more interactive than during the morning session. Just like always, this session was one of the best sessions ever. Vinod was true to his word as he presented the subject "Time Management for Developer". This session was the biggest hit of the event because the subject stuck in the mind of every participant.
    Vinod Kumar presenting at Community Tech Days
    Vinod's session was followed by his own small session. Due to "insistent public demand", he presented an interesting subject, "Tricks and Tips of SQL Server".
In 20 minutes he did another awesome job, and all the attendees wanted more of the tricks. As usual, he promised to do that for us next time. Vinod's session was followed by Prabhjot Singh Bakshi's session, in which he presented an appealing Silverlight concept. He too did a great job, and people cheered him.
    Prabhjot presenting at Community Tech Days
    We had a specially invited speaker, Dhananjay Kumar, traveling all the way from Pune. He always supports our cause of helping the community by empowering participants. He presented a topic on Win7 Mobile and SharePoint integration – something many did not even expect to be possible. Kudos to Dhananjay for doing a great job.
    Dhananjay Kumar presenting at Community Tech Days
    All in all, this event was one of the best in the Community Tech Days series in Ahmedabad. We were fortunate that legends from all over the world were present to speak to the community. I'd say never underestimate the power of the community and its influence over the direction of technology.
    Vinod Kumar presenting the trophy to Pinal Dave
    This event was a very special gathering for me personally because of your support of this vibrant community. The following awards were won for last year's performance: Ahmedabad SQL Server User Group (President: Jacob Sebastian; Leader: Pinal Dave) – Best Tier 2 User Group; Best Development Community Individual Contributor – Pinal Dave.
    Speakers
    I was very glad to receive the award on behalf of our entire community.
    Attendees at Community Tech Days
    I want to say thanks to Rushabh Mehta, Vinod Kumar and Dhananjay Kumar for visiting the city and presenting various technology topics at Community Tech Days. Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: MVP, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, SQLAuthority Author Visit, SQLAuthority News, T SQL, Technology

    Read the article

  • Using django-haystack, how do I perform a search with only partial terms?

    - by Sri Raghavan
    I've got a Haystack/Xapian search index for django.contrib.auth.models.User. The template is simply {{object.get_full_name}}, as I intend for a user to type in a name and be able to search for it. My issue is this: if I search, say, Sri (my full first name), I get a result for the user object pertaining to my name. However, if I search Sri Ragh – that is, my full first name plus part of my last name – I get no results. How can I set Haystack up so that I get the appropriate results for partial queries? (I essentially want it to search *Sri Ragh*, but I don't know whether wildcards would actually do the trick, or how to implement them.) This is my search query: results = SearchQuerySet().filter(content='Sri Ragh')

    Read the article

  • SQLAuthority News – SQL Server Technology Evangelists and Evangelism

    - by pinaldave
    This is the exact conversation I had with three people during the recent SQL Server public training. Person 1: "Are you an SQL Server Evangelist?" Pinal: "No, but Vinod Kumar is." Person 1: "Who are you?" Person 2: "He is Pinal, haha!" Person 1: "I know that, but don't you evangelize SQL Server technology?" Pinal: "Hmm… I do that…" Person 1: "In that case, why don't you call yourself an Evangelist?" Pinal: "…! …" Person 2: "Good question! Who are you, Pinal?" Pinal: "I think you are asking about my title, is that correct?" Person 1: "Maybe." Pinal: "I am a Mentor, and I work for Solid Quality Mentors." Person 2: "I have seen you listing yourself as the Founder of SQLAuthority.com… so…" Pinal: "Yeah, that's true." Person 3: "Let me summarize what these people are asking. What they are asking is that you can have multiple titles, so is being an evangelist one of your titles or not?" Pinal: "Well, I am an SQL Server MVP, and lots of people say that we are also evangelists of technology. In fact, we are all evangelists of technology, aren't we?" Person 1: "So let me come back to my original topic: if you are an SQL Server Evangelist, then what is this evangelism?" Person 2: "And who is Vinod Kumar – I have heard a lot about him." Pinal: "Oh okay. Now I get it. Let me explain…"
    The answer was quite long, but since this conversation I have been thinking about the words "evangelist" and "evangelism." I think being an evangelist is one of the most respected jobs in the world, and to do this job one must bear lots of responsibilities. I was asked two questions, so let me answer them one by one.
    Who is Vinod Kumar? Vinod Kumar is a Technology Evangelist for Microsoft and one of the most respected persons in the SQL Server community in India. Let me copy-paste my note from the previous TechEd India 2010 article: "I attended 2 sessions of Vinod Kumar. Vinod is a natural storyteller, so there was no doubt that his sessions would be jam-packed. People attended his sessions simply because Vinod was the best speaker at the event. He did not disappoint the audience even once; he is truly a good speaker. He knows his stuff very well. I personally do not think that in India he can be compared to anyone for SQL."
    Pinal Dave and Vinod Kumar
    What is Technology Evangelism? Here I am listing three posts written by Vinod Kumar in which he talks about technology evangelism and technology evangelists in an in-depth manner. They are highly regarded articles in the community: Evangelism beyond boundaries with an Evangelists !!!; Technology Evangelism Demystified; New face of Online Technology Evangelism. I strongly recommend reading them all; they are wonderful blog posts. Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: About Me, MVP, Pinal Dave, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, SQLAuthority News, T SQL, Technology

    Read the article

  • SQL SERVER – Finding Shortest Distance between Two Shapes using Spatial Data Classes – Ramsetu or Adam’s Bridge

    - by pinaldave
    Recently I was reading an excellent blog post by Lenni Lobel on spatial databases. He has written about the very interesting ShortestLineTo function in the spatial data classes. I really love this new feature for finding the shortest distance between two shapes in SQL Server. Following is an example along the same lines as the one Lenni discusses in his blog article.
        DECLARE @Shape1 geometry = 'POLYGON ((-20 -30, -3 -26, 14 -28, 20 -40, -20 -30))'
        DECLARE @Shape2 geometry = 'POLYGON ((-18 -20, 0 -10, 4 -12, 10 -20, 2 -22, -18 -20))'
        SELECT @Shape1
        UNION ALL
        SELECT @Shape2
        UNION ALL
        SELECT @Shape1.ShortestLineTo(@Shape2).STBuffer(.25)
        GO
    When you run this script, SQL Server finds the shortest distance between the two shapes and draws the line. We use STBuffer so we can see the connecting line clearly. Now let us modify one of the objects and see how the connecting shortest line behaves.
        DECLARE @Shape1 geometry = 'POLYGON ((-20 -30, -3 -30, 14 -28, 20 -40, -20 -30))'
        DECLARE @Shape2 geometry = 'POLYGON ((-18 -20, 0 -10, 4 -12, 10 -20, 2 -22, -18 -20))'
        SELECT @Shape1
        UNION ALL
        SELECT @Shape2
        UNION ALL
        SELECT @Shape1.ShortestLineTo(@Shape2).STBuffer(.25)
        GO
    Once again, let us modify the script and see how ShortestLineTo works.
        DECLARE @Shape1 geometry = 'POLYGON ((-20 -30, -3 -30, 14 -28, 20 -40, -20 -30))'
        DECLARE @Shape2 geometry = 'POLYGON ((-18 -20, 0 -10, 4 -12, 10 -20, 2 -18, -18 -20))'
        SELECT @Shape1
        UNION ALL
        SELECT @Shape2
        UNION ALL
        SELECT @Shape1.ShortestLineTo(@Shape2).STBuffer(.25)
        SELECT @Shape1.STDistance(@Shape2)
        GO
    You can see that as the objects change, the shortest line moves to the appropriate place. Even though this is a very small feature, I think it is really cool to know. While I was working on this example, I suddenly thought about the distance between Sri Lanka and India. The distance is very short – in fact, it is less than 30 km by sea. I decided to map India and Sri Lanka using the spatial data classes. To my surprise, the plotted shortest line is the same as Adam's Bridge, or Ramsetu. Adam's Bridge starts as a chain of shoals from the Dhanushkodi tip of India's Pamban Island and ends at Sri Lanka's Mannar Island. Geological evidence suggests that this bridge is a former land connection between India and Sri Lanka. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Function, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology Tagged: Spatial Database, SQL Spatial
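    The post does not include the India–Sri Lanka script itself, so here is a minimal sketch of the idea using the geography type. The two point coordinates are rough, illustrative approximations of the Dhanushkodi and Mannar Island tips, not values from the original post.
        -- Rough coordinates in WKT (longitude latitude) order, SRID 4326.
        DECLARE @IndiaTip    geography = geography::STGeomFromText('POINT(79.45 9.15)', 4326);
        DECLARE @SriLankaTip geography = geography::STGeomFromText('POINT(79.71 9.11)', 4326);

        -- STDistance on the geography type returns meters; divide to get kilometers.
        SELECT @IndiaTip.STDistance(@SriLankaTip) / 1000.0 AS ApproxDistanceKm;
    With coordinates along these lines, the result lands in the neighborhood of the "less than 30 km" figure mentioned above.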

    Read the article

  • Microsoft TechEd 2010 - Day 3 @ Bangalore

    - by sathya
    Microsoft TechEd 2010 – Day 3 @ Bangalore. Sorry for the delayed post on day 3; I had to travel from Bangalore to Chennai, so I couldn't write for the past two days. On day 3, as usual, we had a lot of simultaneous tracks with various sessions. That day I chose the "Your Data, Our Platform" track. It had sessions on the following 5 topics:
    Developing Data-tier Applications in Visual Studio 2010 – by Sanjay Nagamangalam
    SQL Server Query Optimization, Execution and Debugging Query Performance – by Vinod Kumar M
    SQL Server Utility – It's about more than 1 SQL Server – by Vinod Kumar Jagannathan
    Data Recovery / Consistency with CheckDB – by Vinod Kumar M
    Developing with SQL Server Spatial and Deep Dive into Spatial Indexing – by Pinal Dave
    Developing Data-tier Applications in Visual Studio 2010 – by Sanjay Nagamangalam: This was one of the best sessions I have attended. He explained all the concepts in detail with a demo. The important thing here is the data-tier application project, newly introduced in VS2010, with which we can manage our data along with our application inside Visual Studio itself. We can create databases, tables, procedures, views, etc. right there, and once we deploy, it creates a compressed file called a .dacpac which stores all the changes in table schema, created procedures, etc. in that single file – which reduces the developer's effort in preparing deployment scripts and handing them over to the DBA. It also has policy configurations which can be managed easily by checking rules, much like in Outlook. For example: IF the SQL Server version > 10 THEN deploy ELSE don't. This rule specifies that even if we try to deploy to a SQL Server database with a version less than 10, it will not do it. And if we deploy a .dacpac to a SQL Server production database with the "upgrade DB with this dacpac" option, once everything completes successfully it reports success; otherwise it rolls back to the prior version. Even if it gets deployed successfully and at a later point you wish to revert to the prior version, you can delete the existing dacpac version so that it reverts to the older version of the database changes. And T-shirts were given for the good questions asked in the session.
    SQL Server Query Optimization, Execution and Debugging Query Performance – by Vinod Kumar M: This too was one of the best sessions. The speaker, Vinod, explained everything very clearly. This was a really useful session and, believe it or not, to my knowledge this was the only session in the entire three days of TechEd, other than the keynote, where the seats were completely full (house full) – people were even standing outside to attend it. Such a great one it was. The speaker did a deep dive into the query plan and showed what actually causes the problems. It is all about understanding how SQL Server executes queries: we think one way, and SQL Server never executes it that way. We need to understand that first. He also mentioned that there might be two plans generated for a single query at a point in time because of parallel processors in the system. The key is there in every query: the query plan has an Estimated Row Count and an Actual Row Count, and if SQL Server's estimated row count tallies with the actual row count, your performance will be great. He shared some tweaks to achieve the same.
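    As a minimal illustration of the estimated-vs-actual check described above (the table and query are hypothetical, not from the session): turn on the actual execution plan in SSMS and compare the "Estimated Number of Rows" and "Actual Number of Rows" on each operator.
        -- Show I/O and timing details alongside the actual execution plan.
        SET STATISTICS IO ON;
        SET STATISTICS TIME ON;

        SELECT o.CustomerID, COUNT(*) AS OrderCount
        FROM dbo.Orders AS o
        WHERE o.OrderDate >= '20100101'
        GROUP BY o.CustomerID;

        -- If estimates are far from actuals, stale statistics are a common culprit.
        UPDATE STATISTICS dbo.Orders WITH FULLSCAN;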
    After this, as usual, we had lunch.
    SQL Server Utility – It's about more than 1 SQL Server – by Vinod Kumar Jagannathan: This was more of a DBA's session. I am really sorry – I was totally blank and not interested in attending it, so I walked out to attend "Migrating to the Cloud" by Harish Ranganathan (my favorite speaker), but unfortunately that turned out to be some other person's session. There, the speaker was talking about how to configure connection strings so that we can connect to the SQL Azure platform from Visual Studio, and he also showed us how to deploy the same to Windows Azure. In between there were a lot of technical problems – the laptop hanging, the user account getting locked, switching between systems – and I also came in halfway, so I wasn't able to listen to it fully. Meanwhile, since I have an MCTS certification they gave me a T-shirt with the line "I am Certified. Are you?" and asked me to wear it. If we wore it we might get spotted and be given some goodies, so on the 3rd day I was wearing that T-shirt. I got spotted by Tarun, who was coordinating things for the certification; he was accompanied by a cameraman and they interviewed me about the certification. I was shown live at TechEd and seen by 60,000 live viewers. I was really happy about that.
    Data Recovery / Consistency with CheckDB – by Vinod Kumar M: This too was one of the best sessions at TechEd. This guy is really amazing. In front of us he crashed a database and showed how to recover it in 6 different ways for different kinds of failures. He covered the different types of error messages, like 823, 824 and 825, msdb..suspect_pages, and DBCC CHECKDB with its different parameters. I am really waiting for his session recording to be uploaded on the TechEd website. Here is his contact info if you wish to connect with him – Twitter: @vinodk_sql; Website: www.ExtremeExperts.com; Blog: http://blogs.sqlxml.org/vinodkumar
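    For reference, here is a small sketch of the kind of commands that session revolves around – a consistency check plus the table where 823/824/825 errors get logged (the database name is a placeholder):
        -- Run a full consistency check, suppressing informational messages.
        DBCC CHECKDB (N'YourDatabase') WITH NO_INFOMSGS, ALL_ERRORMSGS;

        -- Pages that raised 823/824/825 errors are recorded by SQL Server here.
        SELECT database_id, file_id, page_id, event_type, error_count, last_update_date
        FROM msdb.dbo.suspect_pages;
    Repair options such as REPAIR_ALLOW_DATA_LOSS exist, but restoring from a good backup is normally the preferred recovery path.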
    Developing with SQL Server Spatial and Deep Dive into Spatial Indexing – by Pinal Dave: Pinal Dave is a king of SQL; he is a SQL Server MVP and the owner of SQLAuthority.com. He took the session on spatial databases from the very start. He showed the different spatial types, geometry and geography: geometry works with x and y axes on a planar surface, while geography uses a spherical surface (with 360° as the maximum) to represent geographic points on the earth, which makes it easy to draw maps of different kinds. He had a lot of obstacles during his session – rain coming inside the hall, mic wires shorting because of the rain, the video going off on the display screens. In spite of that, he asked the audience to come to the front rows and managed to deliver a good session without slides; finally the displays came back on and he showed demos of what he had explained orally. That was a really fun-filled, informative session. He gave books to the people who asked good questions or answered his questions well, and I got one too (a book on data mining from Wrox publishers).
    And finally, after all these things, there was the closing keynote of TechEd. We all assembled in a big hall where Mr. Ashok Soota, around 70 years of age and co-founder of MindTree, was invited to give a lecture on his successes. He talked about his past, the companies he moved between and the reasons why, his successes and failures, what he learned from his past failures, and his successes and failures in partnerships. There were some questions for him, like: What is your suggestion for young entrepreneurs? How did you learn from past failures? How do you keep repeating your success? What is your suggestion on partnerships, and how does one choose partners? etc. They said there would be a party night at 7:30 PM, but unfortunately I was not able to attend it because I had to pack my things and catch my train, so I left at 7 itself. That's it about TechEd!!! Stay tuned for further technology updates.

    Read the article
