Search Results

Search found 28223 results on 1129 pages for 'view controller'.


  • Company Review: Google Products

Google, Inc. offers an array of products and services to its end users. However, its search capabilities are the foundation of Google’s current success and its primary business focus. Currently, Google offers over twenty different search applications that allow users to search the internet for books, maps, videos, images, products and much more. Its product decisions have allowed user demands to be met while staying focused on a free-of-charge model. This gives users access to Google data at no cost and, together with the accuracy of its search results, indirectly gives Google a strong competitive advantage over other competitors. According to Google, Inc., it offers the following types of searching capabilities:
Alerts: Get email updates on the topics of your choice
Blog Search: Find blogs on your favorite topics
Books: Search the full text of books
Custom Search: Create a customized search experience for your community
Desktop: Search and personalize your computer
Dictionary: Search for definitions of words and phrases
Directory: Search the web, organized by topic or category
Earth: Explore the world from your computer
Finance: Business info, news and interactive charts
GOOG-411: Find and connect for free with businesses from your phone
Images: Search for images on the web
Maps: View maps and directions
News: Search thousands of news stories
Patent Search: Search the full text of US Patents
Product Search: Search for stuff to buy
Scholar: Search scholarly papers
Toolbar: Add a search box to your browser
Trends: Explore past and present search trends
Videos: Search for videos on the web
Web Search: Search billions of web pages
Web Search Features: Find movies, music, stocks, books and more
Google’s free-of-charge business model is only one way it differentiates itself from its competition. There is also a strong focus on the accuracy of search results and the speed with which they are returned to the end user. Quality function deployment (QFD) is a structured method used to help connect user needs to the design features of a project proposed to address those needs. This method is particularly useful in accounting for needs that are not easily articulated or precisely defined, according to the U.S. Department of Transportation Federal Highway Administration. Because QFD is so customer driven, Google is in a constant state of change, attempting to reengineer its search algorithms and other dependent systems so that end-user requirements are continually met. Value engineering is a key example of this: Google is constantly trying to improve all aspects of its products, improve system maintainability, and improve system interoperability. The Bridgefield Group defines value engineering as an organized methodology that identifies and selects the lowest lifecycle cost options in design, materials and processes that achieve the desired level of performance, reliability and customer satisfaction. In addition, it seeks to remove unnecessary costs in the above areas and is often a joint effort with cross-functional internal teams and relevant suppliers. Common issues that appear when developing large-scale systems like Google’s search applications include modular design of a product and/or service and providing accurate value analysis.
The Open System Joint Task Force defines modular design as a design approach that adheres to four fundamental tenets (cohesiveness, encapsulation, self-containment, and high binding) to design a system component as an independently operable unit subject to change. More specifically, M. S. Schmaltz describes modular software design this way: rather than having a large collection of statements strung together in one partition of in-line code, we segment or divide the statements into logical groups called modules. Each module performs one or two tasks, and then passes control to another module. By breaking up the code into "bite-sized chunks", so to speak, we are able to better control the flow of data and control. This is especially true in large software systems. Value analysis is a process to evaluate products and services based on effectiveness, safety, and cost. Value analysis involves assessing the quality as well as the cost of a product or service, as defined by the Healthcare Financial Management Association. “Operations Management deals with the design and management of products, processes, services and supply chains. It considers the acquisition, development, and utilization of resources that firms need to deliver the goods and services their clients want.” (MIT, 2010) Google, Inc. encourages an open environment among all employees, also known as Googlers. This is reinforced by cross-functional teams, comprised of members from multiple departments, assigned to every project so that departments such as marketing, finance, and quality assurance have input on every project. In addition, Google is known for its openness to new ideas regardless of the status or seniority of an employee. In fact, Google allows 20% of an employee’s time to be devoted to developing new ideas and/or pet projects. HumTech.com defines a cross-functional team as a collection of people with varied levels of skills and experience brought together to accomplish a task. As the name implies, cross-functional team members come from different organizational units. Cross-functional teams may be permanent or ad hoc. Google’s search application product strategy primarily focuses on mass customization. This allows Google to create a base search application and allows results to be returned to end users quickly based on specific parameters and search settings. In addition, Google also stores the data that is returned in case others desire the same results, based on other end users supplying the same customized settings. This allows Google to appear to render search results in virtually real time to the user while allowing for complete customization of the searching criteria. Greg Vogl, a professor at Uganda Martyrs University, defines mass customization as when a business gives its customers the opportunity to tailor its products or services to the customer's specifications. The IT staff at Google plays a key role in ensuring that the search application’s product strategy is maintained, simply because the IT staff designs, develops, and maintains all of Google's proprietary applications. In fact, they also maintain all network infrastructure to ensure that it is available to all end users.
References:
http://www.google.com/intl/en/options/
http://ops.fhwa.dot.gov/freight/publications/ftat_user_guide/sec5.htm
http://www.bridgefieldgroup.com/bridgefieldgroup/glos9.htm#V
http://www.acq.osd.mil/osjtf/termsdef.html
http://www.cise.ufl.edu/~mssz/Pascal-CGS2462/prog-dsn.html
http://www.hfma.org/publications/business_caring_newsletter/exclusives/Supply+and+Inventory+Terms+Defined.htm
http://mitsloan.mit.edu/omg/om-definition.php
http://www.humtech.com/opm/grtl/ols/ols3.cfm
http://www.gregvogl.net/courses/mis1/glossary.htm


  • How do you setup the driver for a Philips based TV capture card?

    - by user8270
Hi, I have a TV card that I have not managed to get working under Ubuntu 10.10 i386. I have tried the suggestions in various forum topics but could not get it installed. I hope you can help me install it, thank you.

lspci
01:07.0 Multimedia controller: Philips Semiconductors SAA7130 Video Broadcast Decoder (rev 01)

dmesg
[10299.516344] saa7134 ALSA driver for DMA sound unloaded
[11385.340661] Linux video capture interface: v2.00
[11385.384278] saa7130/34: v4l2 driver version 0.2.16 loaded
[11385.384390] saa7130[0]: found at 0000:01:07.0, rev: 1, irq: 17, latency: 32, mmio: 0x0
[11385.384403] saa7130[0]: subsystem: 1131:0000, board: LifeView/Typhoon FlyVIDEO2000 [card=3,insmod option]
[11385.384412] saa7130[0]: can't get MMIO memory @ 0x0
[11385.384431] saa7134: probe of 0000:01:07.0 failed with error -16
[11385.401174] saa7134 ALSA driver for DMA sound loaded
[11385.401182] saa7134 ALSA: no saa7134 cards found
[11477.797019] tvtime[12534]: segfault at 6b0 ip 0804cf64 sp bf928a4c error 4 in tvtime[8048000+76000]
[11626.141821] tvtime[12549]: segfault at 6b0 ip 0804cf64 sp bfec357c error 4 in tvtime[8048000+76000]
[12218.120632] saa7134 ALSA driver for DMA sound unloaded
[12464.993061] Linux video capture interface: v2.00
[12465.028285] saa7130/34: v4l2 driver version 0.2.16 loaded
[12465.028392] saa7130[0]: found at 0000:01:07.0, rev: 1, irq: 17, latency: 32, mmio: 0x0
[12465.028404] saa7134: <rant>
[12465.028406] saa7134: Congratulations! Your TV card vendor saved a few
[12465.028408] saa7134: cents for a eeprom, thus your pci board has no
[12465.028411] saa7134: subsystem ID and I can't identify it automatically
[12465.028414] saa7134: </rant>
[12465.028416] saa7134: I feel better now. Ok, here are the good news:
[12465.028418] saa7134: You can use the card=<nr> insmod option to specify
[12465.028421] saa7134: which board do you have.
The list: [12465.028428] saa7134: card=0 -> UNKNOWN/GENERIC [12465.028435] saa7134: card=1 -> Proteus Pro [philips reference design] 1131:2001 1131:2001 [12465.028447] saa7134: card=2 -> LifeView FlyVIDEO3000 5168:0138 4e42:0138 [12465.028457] saa7134: card=3 -> LifeView/Typhoon FlyVIDEO2000 5168:0138 4e42:0138 [12465.028467] saa7134: card=4 -> EMPRESS 1131:6752 [12465.028475] saa7134: card=5 -> SKNet Monster TV 1131:4e85 [12465.028484] saa7134: card=6 -> Tevion MD 9717 [12465.028491] saa7134: card=7 -> KNC One TV-Station RDS / Typhoon TV Tune 1131:fe01 1894:fe01 [12465.028501] saa7134: card=8 -> Terratec Cinergy 400 TV 153b:1142 [12465.028510] saa7134: card=9 -> Medion 5044 [12465.028517] saa7134: card=10 -> Kworld/KuroutoShikou SAA7130-TVPCI [12465.028523] saa7134: card=11 -> Terratec Cinergy 600 TV 153b:1143 [12465.028532] saa7134: card=12 -> Medion 7134 16be:0003 16be:5000 [12465.028542] saa7134: card=13 -> Typhoon TV+Radio 90031 [12465.028548] saa7134: card=14 -> ELSA EX-VISION 300TV 1048:226b [12465.028557] saa7134: card=15 -> ELSA EX-VISION 500TV 1048:226a [12465.028565] saa7134: card=16 -> ASUS TV-FM 7134 1043:4842 1043:4830 1043:4840 [12465.028576] saa7134: card=17 -> AOPEN VA1000 POWER 1131:7133 [12465.028585] saa7134: card=18 -> BMK MPEX No Tuner [12465.028592] saa7134: card=19 -> Compro VideoMate TV 185b:c100 [12465.028600] saa7134: card=20 -> Matrox CronosPlus 102b:48d0 [12465.028608] saa7134: card=21 -> 10MOONS PCI TV CAPTURE CARD 1131:2001 [12465.028617] saa7134: card=22 -> AverMedia M156 / Medion 2819 1461:a70b [12465.028625] saa7134: card=23 -> BMK MPEX Tuner [12465.028632] saa7134: card=24 -> KNC One TV-Station DVR 1894:a006 [12465.028640] saa7134: card=25 -> ASUS TV-FM 7133 1043:4843 [12465.028648] saa7134: card=26 -> Pinnacle PCTV Stereo (saa7134) 11bd:002b [12465.028657] saa7134: card=27 -> Manli MuchTV M-TV002 [12465.028663] saa7134: card=28 -> Manli MuchTV M-TV001 [12465.028670] saa7134: card=29 -> Nagase Sangyo TransGear 3000TV 1461:050c [12465.028679] saa7134: card=30 -> Elitegroup ECS TVP3XP FM1216 Tuner Card( 1019:4cb4 [12465.028687] saa7134: card=31 -> Elitegroup ECS TVP3XP FM1236 Tuner Card 1019:4cb5 [12465.028695] saa7134: card=32 -> AVACS SmartTV [12465.028702] saa7134: card=33 -> AVerMedia DVD EZMaker 1461:10ff [12465.028710] saa7134: card=34 -> Noval Prime TV 7133 [12465.028717] saa7134: card=35 -> AverMedia AverTV Studio 305 1461:2115 [12465.028725] saa7134: card=36 -> UPMOST PURPLE TV 12ab:0800 [12465.028734] saa7134: card=37 -> Items MuchTV Plus / IT-005 [12465.028740] saa7134: card=38 -> Terratec Cinergy 200 TV 153b:1152 [12465.028749] saa7134: card=39 -> LifeView FlyTV Platinum Mini 5168:0212 4e42:0212 5169:1502 [12465.028760] saa7134: card=40 -> Compro VideoMate TV PVR/FM 185b:c100 [12465.028768] saa7134: card=41 -> Compro VideoMate TV Gold+ 185b:c100 [12465.028776] saa7134: card=42 -> Sabrent SBT-TVFM (saa7130) [12465.028783] saa7134: card=43 -> :Zolid Xpert TV7134 [12465.028790] saa7134: card=44 -> Empire PCI TV-Radio LE [12465.028796] saa7134: card=45 -> Avermedia AVerTV Studio 307 1461:9715 [12465.028805] saa7134: card=46 -> AVerMedia Cardbus TV/Radio (E500) 1461:d6ee [12465.028813] saa7134: card=47 -> Terratec Cinergy 400 mobile 153b:1162 [12465.028821] saa7134: card=48 -> Terratec Cinergy 600 TV MK3 153b:1158 [12465.028830] saa7134: card=49 -> Compro VideoMate Gold+ Pal 185b:c200 [12465.028838] saa7134: card=50 -> Pinnacle PCTV 300i DVB-T + PAL 11bd:002d [12465.028847] saa7134: card=51 -> ProVideo PV952 1540:9524 [12465.028855] saa7134: card=52 
-> AverMedia AverTV/305 1461:2108 [12465.028863] saa7134: card=53 -> ASUS TV-FM 7135 1043:4845 [12465.028871] saa7134: card=54 -> LifeView FlyTV Platinum FM / Gold 5168:0214 5168:5214 1489:0214 5168:0304 [12465.028884] saa7134: card=55 -> LifeView FlyDVB-T DUO / MSI TV@nywhere D 5168:0306 4e42:0306 [12465.028894] saa7134: card=56 -> Avermedia AVerTV 307 1461:a70a [12465.028903] saa7134: card=57 -> Avermedia AVerTV GO 007 FM 1461:f31f [12465.028911] saa7134: card=58 -> ADS Tech Instant TV (saa7135) 1421:0350 1421:0351 1421:0370 1421:1370 [12465.028924] saa7134: card=59 -> Kworld/Tevion V-Stream Xpert TV PVR7134 [12465.028931] saa7134: card=60 -> LifeView/Typhoon/Genius FlyDVB-T Duo Car 5168:0502 4e42:0502 1489:0502 [12465.028942] saa7134: card=61 -> Philips TOUGH DVB-T reference design 1131:2004 [12465.028951] saa7134: card=62 -> Compro VideoMate TV Gold+II [12465.028958] saa7134: card=63 -> Kworld Xpert TV PVR7134 [12465.028964] saa7134: card=64 -> FlyTV mini Asus Digimatrix 1043:0210 [12465.028973] saa7134: card=65 -> V-Stream Studio TV Terminator [12465.028980] saa7134: card=66 -> Yuan TUN-900 (saa7135) [12465.028986] saa7134: card=67 -> Beholder BeholdTV 409 FM 0000:4091 [12465.028995] saa7134: card=68 -> GoTView 7135 PCI 5456:7135 [12465.029003] saa7134: card=69 -> Philips EUROPA V3 reference design 1131:2004 [12465.029011] saa7134: card=70 -> Compro Videomate DVB-T300 185b:c900 [12465.029020] saa7134: card=71 -> Compro Videomate DVB-T200 185b:c901 [12465.029028] saa7134: card=72 -> RTD Embedded Technologies VFG7350 1435:7350 [12465.029036] saa7134: card=73 -> RTD Embedded Technologies VFG7330 1435:7330 [12465.029045] saa7134: card=74 -> LifeView FlyTV Platinum Mini2 14c0:1212 [12465.029053] saa7134: card=75 -> AVerMedia AVerTVHD MCE A180 1461:1044 [12465.029062] saa7134: card=76 -> SKNet MonsterTV Mobile 1131:4ee9 [12465.029070] saa7134: card=77 -> Pinnacle PCTV 40i/50i/110i (saa7133) 11bd:002e [12465.029078] saa7134: card=78 -> ASUSTeK P7131 Dual 1043:4862 [12465.029087] saa7134: card=79 -> Sedna/MuchTV PC TV Cardbus TV/Radio (ITO [12465.029094] saa7134: card=80 -> ASUS Digimatrix TV 1043:0210 [12465.029102] saa7134: card=81 -> Philips Tiger reference design 1131:2018 [12465.029110] saa7134: card=82 -> MSI TV@Anywhere plus 1462:6231 1462:8624 [12465.029120] saa7134: card=83 -> Terratec Cinergy 250 PCI TV 153b:1160 [12465.029128] saa7134: card=84 -> LifeView FlyDVB Trio 5168:0319 [12465.029137] saa7134: card=85 -> AverTV DVB-T 777 1461:2c05 1461:2c05 [12465.029147] saa7134: card=86 -> LifeView FlyDVB-T / Genius VideoWonder D 5168:0301 1489:0301 [12465.029156] saa7134: card=87 -> ADS Instant TV Duo Cardbus PTV331 0331:1421 [12465.029165] saa7134: card=88 -> Tevion/KWorld DVB-T 220RF 17de:7201 [12465.029173] saa7134: card=89 -> ELSA EX-VISION 700TV 1048:226c [12465.029182] saa7134: card=90 -> Kworld ATSC110/115 17de:7350 17de:7352 [12465.029191] saa7134: card=91 -> AVerMedia A169 B 1461:7360 [12465.029200] saa7134: card=92 -> AVerMedia A169 B1 1461:6360 [12465.029208] saa7134: card=93 -> Medion 7134 Bridge #2 16be:0005 [12465.029216] saa7134: card=94 -> LifeView FlyDVB-T Hybrid Cardbus/MSI TV 5168:3306 5168:3502 5168:3307 4e42:3502 [12465.029229] saa7134: card=95 -> LifeView FlyVIDEO3000 (NTSC) 5169:0138 [12465.029238] saa7134: card=96 -> Medion Md8800 Quadro 16be:0007 16be:0008 16be:000d [12465.029249] saa7134: card=97 -> LifeView FlyDVB-S /Acorp TV134DS 5168:0300 4e42:0300 [12465.029259] saa7134: card=98 -> Proteus Pro 2309 0919:2003 [12465.029267] saa7134: card=99 -> AVerMedia TV 
Hybrid A16AR 1461:2c00 [12465.029276] saa7134: card=100 -> Asus Europa2 OEM 1043:4860 [12465.029284] saa7134: card=101 -> Pinnacle PCTV 310i 11bd:002f [12465.029293] saa7134: card=102 -> Avermedia AVerTV Studio 507 1461:9715 [12465.029301] saa7134: card=103 -> Compro Videomate DVB-T200A [12465.029308] saa7134: card=104 -> Hauppauge WinTV-HVR1110 DVB-T/Hybrid 0070:6700 0070:6701 0070:6702 0070:6703 0070:6704 0070:6705 [12465.029324] saa7134: card=105 -> Terratec Cinergy HT PCMCIA 153b:1172 [12465.029332] saa7134: card=106 -> Encore ENLTV 1131:2342 1131:2341 3016:2344 [12465.029344] saa7134: card=107 -> Encore ENLTV-FM 1131:230f [12465.029352] saa7134: card=108 -> Terratec Cinergy HT PCI 153b:1175 [12465.029360] saa7134: card=109 -> Philips Tiger - S Reference design [12465.029367] saa7134: card=110 -> Avermedia M102 1461:f31e [12465.029375] saa7134: card=111 -> ASUS P7131 4871 1043:4871 [12465.029384] saa7134: card=112 -> ASUSTeK P7131 Hybrid 1043:4876 [12465.029392] saa7134: card=113 -> Elitegroup ECS TVP3XP FM1246 Tuner Card 1019:4cb6 [12465.029401] saa7134: card=114 -> KWorld DVB-T 210 17de:7250 [12465.029409] saa7134: card=115 -> Sabrent PCMCIA TV-PCB05 0919:2003 [12465.029418] saa7134: card=116 -> 10MOONS TM300 TV Card 1131:2304 [12465.029426] saa7134: card=117 -> Avermedia Super 007 1461:f01d [12465.029435] saa7134: card=118 -> Beholder BeholdTV 401 0000:4016 [12465.029443] saa7134: card=119 -> Beholder BeholdTV 403 0000:4036 [12465.029451] saa7134: card=120 -> Beholder BeholdTV 403 FM 0000:4037 [12465.029459] saa7134: card=121 -> Beholder BeholdTV 405 0000:4050 [12465.029468] saa7134: card=122 -> Beholder BeholdTV 405 FM 0000:4051 [12465.029476] saa7134: card=123 -> Beholder BeholdTV 407 0000:4070 [12465.029484] saa7134: card=124 -> Beholder BeholdTV 407 FM 0000:4071 [12465.029493] saa7134: card=125 -> Beholder BeholdTV 409 0000:4090 [12465.029501] saa7134: card=126 -> Beholder BeholdTV 505 FM 5ace:5050 [12465.029510] saa7134: card=127 -> Beholder BeholdTV 507 FM / BeholdTV 509 5ace:5070 5ace:5090 [12465.029520] saa7134: card=128 -> Beholder BeholdTV Columbus TVFM 0000:5201 [12465.029528] saa7134: card=129 -> Beholder BeholdTV 607 FM 5ace:6070 [12465.029537] saa7134: card=130 -> Beholder BeholdTV M6 5ace:6190 [12465.029545] saa7134: card=131 -> Twinhan Hybrid DTV-DVB 3056 PCI 1822:0022 [12465.029554] saa7134: card=132 -> Genius TVGO AM11MCE [12465.029560] saa7134: card=133 -> NXP Snake DVB-S reference design [12465.029567] saa7134: card=134 -> Medion/Creatix CTX953 Hybrid 16be:0010 [12465.029576] saa7134: card=135 -> MSI TV@nywhere A/D v1.1 1462:8625 [12465.029584] saa7134: card=136 -> AVerMedia Cardbus TV/Radio (E506R) 1461:f436 [12465.029592] saa7134: card=137 -> AVerMedia Hybrid TV/Radio (A16D) 1461:f936 [12465.029601] saa7134: card=138 -> Avermedia M115 1461:a836 [12465.029609] saa7134: card=139 -> Compro VideoMate T750 185b:c900 [12465.029617] saa7134: card=140 -> Avermedia DVB-S Pro A700 1461:a7a1 [12465.029626] saa7134: card=141 -> Avermedia DVB-S Hybrid+FM A700 1461:a7a2 [12465.029634] saa7134: card=142 -> Beholder BeholdTV H6 5ace:6290 [12465.029642] saa7134: card=143 -> Beholder BeholdTV M63 5ace:6191 [12465.029651] saa7134: card=144 -> Beholder BeholdTV M6 Extra 5ace:6193 [12465.029659] saa7134: card=145 -> AVerMedia MiniPCI DVB-T Hybrid M103 1461:f636 1461:f736 [12465.029669] saa7134: card=146 -> ASUSTeK P7131 Analog [12465.029676] saa7134: card=147 -> Asus Tiger 3in1 1043:4878 [12465.029684] saa7134: card=148 -> Encore ENLTV-FM v5.3 1a7f:2008 [12465.029693] saa7134: 
card=149 -> Avermedia PCI pure analog (M135A) 1461:f11d [12465.029701] saa7134: card=150 -> Zogis Real Angel 220 [12465.029708] saa7134: card=151 -> ADS Tech Instant HDTV 1421:0380 [12465.029716] saa7134: card=152 -> Asus Tiger Rev:1.00 1043:4857 [12465.029725] saa7134: card=153 -> Kworld Plus TV Analog Lite PCI 17de:7128 [12465.029733] saa7134: card=154 -> Avermedia AVerTV GO 007 FM Plus 1461:f31d [12465.029742] saa7134: card=155 -> Hauppauge WinTV-HVR1150 ATSC/QAM-Hybrid 0070:6706 0070:6708 [12465.029752] saa7134: card=156 -> Hauppauge WinTV-HVR1120 DVB-T/Hybrid 0070:6707 0070:6709 0070:670a [12465.029763] saa7134: card=157 -> Avermedia AVerTV Studio 507UA 1461:a11b [12465.029772] saa7134: card=158 -> AVerMedia Cardbus TV/Radio (E501R) 1461:b7e9 [12465.029780] saa7134: card=159 -> Beholder BeholdTV 505 RDS 0000:505b [12465.029789] saa7134: card=160 -> Beholder BeholdTV 507 RDS 0000:5071 [12465.029797] saa7134: card=161 -> Beholder BeholdTV 507 RDS 0000:507b [12465.029806] saa7134: card=162 -> Beholder BeholdTV 607 FM 5ace:6071 [12465.029815] saa7134: card=163 -> Beholder BeholdTV 609 FM 5ace:6090 [12465.029823] saa7134: card=164 -> Beholder BeholdTV 609 FM 5ace:6091 [12465.029832] saa7134: card=165 -> Beholder BeholdTV 607 RDS 5ace:6072 [12465.029840] saa7134: card=166 -> Beholder BeholdTV 607 RDS 5ace:6073 [12465.029849] saa7134: card=167 -> Beholder BeholdTV 609 RDS 5ace:6092 [12465.029857] saa7134: card=168 -> Beholder BeholdTV 609 RDS 5ace:6093 [12465.029866] saa7134: card=169 -> Compro VideoMate S350/S300 185b:c900 [12465.029874] saa7134: card=170 -> AverMedia AverTV Studio 505 1461:a115 [12465.029883] saa7134: card=171 -> Beholder BeholdTV X7 5ace:7595 [12465.029892] saa7134: card=172 -> RoverMedia TV Link Pro FM 19d1:0138 [12465.029900] saa7134: card=173 -> Zolid Hybrid TV Tuner PCI 1131:2004 [12465.029909] saa7134: card=174 -> Asus Europa Hybrid OEM 1043:4847 [12465.029917] saa7134: card=175 -> Leadtek Winfast DTV1000S 107d:6655 [12465.029926] saa7134: card=176 -> Beholder BeholdTV 505 RDS 0000:5051 [12465.029934] saa7134: card=177 -> Hawell HW-404M7 [12465.029941] saa7134: card=178 -> Beholder BeholdTV H7 [12465.029948] saa7134: card=179 -> Beholder BeholdTV A7 [12465.029955] saa7134: card=180 -> Avermedia PCI M733A 1461:4155 1461:4255 [12465.029967] saa7130[0]: subsystem: 1131:0000, board: UNKNOWN/GENERIC [card=0,autodetected] [12465.030033] saa7130[0]: can't get MMIO memory @ 0x0 [12465.030051] saa7134: probe of 0000:01:07.0 failed with error -16 [12465.053892] saa7134 ALSA driver for DMA sound loaded [12465.053900] saa7134 ALSA: no saa7134 cards found tvtime-scanner Leyendo la configuración de /etc/tvtime/tvtime.xml Leyendo la configuración de /home/ricardo/.tvtime/tvtime.xml Escaneando usando la norma de TV NTSC. /home/ricardo/.tvtime/stationlist.xml: No existing NTSC station list "Custom". videoinput: Cannot open capture device /dev/video0: No existe el dispositivo o la dirección
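Based on the driver's own hint in the log above (the card=<nr> insmod option) and the earlier probe that identified the board as LifeView/Typhoon FlyVIDEO2000 (card=3), one thing worth trying is forcing that board type when the module loads. This is only a sketch of the usual approach, not a confirmed fix; note that the log also reports an MMIO/resource error (error -16), which forcing the card type alone may not resolve:

```bash
# Unload and reload the saa7134 driver, forcing the board type the first probe reported (card=3).
sudo modprobe -r saa7134-alsa saa7134
sudo modprobe saa7134 card=3

# If the card then appears as /dev/video0, make the setting persistent across reboots:
echo "options saa7134 card=3" | sudo tee /etc/modprobe.d/saa7134.conf
```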


  • SQL SERVER – Extending SQL Azure with Azure worker role – Guest Post by Paras Doshi

    - by pinaldave
This is a guest post by Paras Doshi. Paras Doshi is a research intern at SolidQ.com and a Microsoft Student Partner. He is currently working in the domain of SQL Azure. SQL Azure is essentially SQL Server in the cloud. SQL Azure provides benefits such as on-demand rapid provisioning, cost-effective scalability, high availability and reduced management overhead. For an introduction to SQL Azure, check out the post by Pinal here. In this article, we are going to discuss how to extend SQL Azure with the Azure worker role. In other words, we will attempt to write custom code and host it in the Azure worker role; the aim is to add some features that are not available with SQL Azure currently, or features that need to be customized for flexibility. In this way we extend SQL Azure's capability by building solutions that run on Azure as worker roles. To understand the Azure worker role, think of it as a Windows service in the cloud. An Azure worker role can perform background processes, which makes it an ideal tool for handling processes such as synchronization and backup. First, we will focus on writing worker role code that synchronizes SQL Azure databases. Before we do so, let's see some scenarios in which synchronization between SQL Azure databases is beneficial:
- Scaling out access over multiple databases enables us to handle workload efficiently.
- As of now, a SQL Azure database can be hosted in any one of six datacenters. By synchronizing databases located in different datacenters, one can extend the data by enabling access to geographically distributed data.
Let us also see some scenarios in which SQL Server to SQL Azure database synchronization is beneficial:
- To back up a SQL Azure database on local infrastructure.
- Rather than investing in local infrastructure for increased workloads, such workloads could be handled by the cloud.
- The ability to extend data to different datacenters located across the world, to enable efficient data access from remote locations.
Now, let us develop a cloud-based app that synchronizes SQL Azure databases. For an introduction to developing cloud-based apps, click here. In this article, I aim to provide a bird's-eye view of what code that synchronizes SQL Azure databases looks like, and then list resources that can help you develop the solution from scratch. If you newly add a worker role to the cloud-based project, this is what the code will look like. (Note: I have added comments to the skeleton code to point out the modifications that will be required in the code to carry out the SQL Azure synchronization. Note the placement of the Setup() and Sync() functions.) Click here (http://parasdoshi1989.files.wordpress.com/2011/06/code-snippet-1-for-extending-sql-azure-with-azure-worker-role1.pdf). Enabling SQL Azure database synchronization through the Sync Framework is a two-step process. In the first step, the database is provisioned: the Sync Framework creates tracking tables, stored procedures, triggers, and tables to store metadata to enable synchronization. This is a one-time step. The code for this is put in the Setup() function, which is called once when the worker role starts. The second step is continuous (or on-demand) synchronization of the SQL Azure databases by propagating changes between them. This is done on a continuous basis by calling the Sync() function in the while loop. The code logic to synchronize changes between SQL Azure databases should be put in the Sync() function. Discussing the coding part step by step is out of the scope of this article; a minimal sketch is shown below.
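To give a rough idea of where Setup() and Sync() sit, here is a minimal, hypothetical C# sketch of a worker role that uses Sync Framework 2.1. It is not the code from the linked PDF; the scope name, table name, and connection strings are placeholder assumptions, and error handling is omitted:

```csharp
using System;
using System.Data.SqlClient;
using System.Threading;
using Microsoft.WindowsAzure.ServiceRuntime;
using Microsoft.Synchronization;
using Microsoft.Synchronization.Data;
using Microsoft.Synchronization.Data.SqlServer;

public class WorkerRole : RoleEntryPoint
{
    // Placeholder connection strings: two SQL Azure databases to keep in sync.
    private const string Db1 = "Server=tcp:serverA.database.windows.net;Database=Db1;User ID=...;Password=...;";
    private const string Db2 = "Server=tcp:serverB.database.windows.net;Database=Db2;User ID=...;Password=...;";

    public override void Run()
    {
        Setup();                                     // one-time provisioning
        while (true)
        {
            Sync();                                  // continuous synchronization
            Thread.Sleep(TimeSpan.FromMinutes(5));   // or check the clock here to run only at a set time
        }
    }

    // Provision both databases: Sync Framework adds tracking tables, triggers and metadata.
    private void Setup()
    {
        using (var conn1 = new SqlConnection(Db1))
        using (var conn2 = new SqlConnection(Db2))
        {
            var scope = new DbSyncScopeDescription("SyncScope");
            scope.Tables.Add(SqlSyncDescriptionBuilder.GetDescriptionForTable("Customers", conn1));

            var provision1 = new SqlSyncScopeProvisioning(conn1, scope);
            if (!provision1.ScopeExists("SyncScope")) provision1.Apply();

            var provision2 = new SqlSyncScopeProvisioning(conn2, scope);
            if (!provision2.ScopeExists("SyncScope")) provision2.Apply();
        }
    }

    // Propagate changes between the two databases.
    private void Sync()
    {
        using (var conn1 = new SqlConnection(Db1))
        using (var conn2 = new SqlConnection(Db2))
        {
            var orchestrator = new SyncOrchestrator
            {
                LocalProvider = new SqlSyncProvider("SyncScope", conn1),
                RemoteProvider = new SqlSyncProvider("SyncScope", conn2),
                Direction = SyncDirectionOrder.UploadAndDownload
            };
            orchestrator.Synchronize();
        }
    }
}
```

The scheduled variant discussed below (synchronizing every day at 10:00 pm) would simply replace the fixed sleep with a check of the current time before calling Sync().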
Therefore, let me point you to a resource, which is given here. Also, note that before you start developing the code, you will need to install the Sync Framework 2.1 SDK (download here). Further, you will need to reference some libraries before you start coding; details are available in the article I just pointed to. You will be charged for data transfers if the databases are not in the same datacenter. For pricing information, go here. Currently, a tool named Data Sync, which is built on top of the Sync Framework, is available in CTP; it allows SQL Azure <-> SQL Server and SQL Azure <-> SQL Azure synchronization (without writing a single line of code). However, in some cases the custom code shown in this blog post provides flexibility that is not available with Data Sync. For instance, filtering is not supported in the SQL Azure Data Sync CTP2; if you wish to have such functionality now, then you have the option of developing custom code using the Sync Framework. Now, this code can easily be extended to synchronize on a schedule. Let us say we want the databases to be synchronized every day at 10:00 pm. This is what the code will look like now: (http://parasdoshi1989.files.wordpress.com/2011/06/code-snippet-2-for-extending-sql-azure-with-azure-worker-role.pdf) Don't you think that by writing such code, we are imitating the functionality provided by the SQL Server Agent for a SQL Server? Think about it. We are scheduling our administrative task by writing custom code; in other words, we have developed a "lightweight SQL Server Agent for SQL Azure!" Since SQL Server Agent is not currently available in the cloud, we have developed a solution that enables us to schedule tasks, and thus we have extended SQL Azure with the Azure worker role! Now, if you wish to track jobs, you can do so by storing this data in SQL Azure (or Azure tables). The reason is that Windows Azure is a stateless platform, so we need to store the state of the job ourselves, and the choices available are SQL Azure or Azure tables. Note that this solution requires custom code and is not UI driven; however, for now, it can act as a temporary solution until SQL Server Agent is made available in the cloud. Moreover, this solution does not encompass all the functionality that SQL Server Agent provides, but it does open up an interesting avenue to schedule tasks such as backup and synchronization of SQL Azure databases by writing some custom code in the Azure worker role. Now, let us consider one more possibility: running BCP through a worker role in Azure-hosted services and then uploading the backup files either to local storage or to blobs. If you upload them to local storage, then consider the data transfer cost. If you upload them to blobs residing in the same datacenter, then no transfer cost applies, but the cost of blob storage does. So, before choosing an option, you need to evaluate your preferences, keeping the cost associated with each option in mind. In this article, I have shown that an Azure worker role solution can be developed to synchronize SQL Azure databases, and that a lightweight SQL Server Agent for SQL Azure can be developed. We also discussed the possibility of running BCP through a worker role in Azure-hosted services for backing up our precious SQL Azure data. Thus, we can extend SQL Azure with the Azure worker role. But remember: you will be charged for running Azure worker roles.
So at the end of the day, you need to ask: am I willing to build custom code and pay money to achieve this functionality? I hope you found this blog post interesting. If you have any questions or feedback, you can comment below or mail me at Paras[at]student-partners[dot]com. Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: Pinal Dave, PostADay, SQL, SQL Authority, SQL Azure, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology


  • Announcing Windows Azure Mobile Services

    - by ScottGu
I’m excited to announce a new capability we are adding to Windows Azure today: Windows Azure Mobile Services. Windows Azure Mobile Services makes it incredibly easy to connect a scalable cloud backend to your client and mobile applications. It allows you to easily store structured data in the cloud that can span both devices and users, integrate it with user authentication, and send out updates to clients via push notifications. Today’s release enables you to add these capabilities to any Windows 8 app in literally minutes, and provides a super productive way for you to quickly build out your app ideas. We’ll also be adding support to enable these same scenarios for Windows Phone, iOS, and Android devices soon. Read this getting started tutorial to walk through how you can build (in less than 5 minutes) a simple Windows 8 “Todo List” app that is cloud enabled using Windows Azure Mobile Services. Or watch this video of me showing how to do it step by step.
Getting Started
If you don’t already have a Windows Azure account, you can sign up for a no-obligation Free Trial. Once you are signed up, click the “preview features” section under the “account” tab of the www.windowsazure.com website and enable your account to support the “Mobile Services” preview. Instructions on how to enable this can be found here. Once you have the Mobile Services preview enabled, log into the Windows Azure Portal, click the “New” button and choose the new “Mobile Services” icon to create your first mobile backend. Once created, you’ll see a quick-start page with instructions on how to connect your mobile service to an existing Windows 8 client app you have already started working on, or how to create and connect a brand-new Windows 8 client app with it. Read this getting started tutorial to walk through how you can build (in less than 5 minutes) a simple Windows 8 “Todo List” app that stores data in Windows Azure.
Storing Data in the Cloud
Storing data in the cloud with Windows Azure Mobile Services is incredibly easy. When you create a Windows Azure Mobile Service, we automatically associate it with a SQL Database inside Windows Azure. The Windows Azure Mobile Service backend then provides built-in support for enabling remote apps to securely store and retrieve data from it (using secure REST endpoints utilizing a JSON-based OData format), without you having to write or deploy any custom server code. Built-in management support is provided within the Windows Azure portal for creating new tables, browsing data, setting indexes, and controlling access permissions. This makes it incredibly easy to connect client applications to the cloud, and enables client developers who don’t have a server-code background to be productive from the very beginning. They can instead focus on building the client app experience, and leverage Windows Azure Mobile Services to provide the cloud backend services they require. Below is an example of the kind of client-side Windows 8 C#/XAML code that could be used to query data from a Windows Azure Mobile Service. Client-side C# developers can write queries like this using LINQ and strongly typed POCO objects, which are then translated into HTTP REST queries that run against a Windows Azure Mobile Service.
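As a rough illustration (not the exact code from the post), such a query might look like the following sketch, assuming a TodoItem table and the Windows Azure Mobile Services client library; the service URL and application key are placeholders:

```csharp
using System.Collections.Generic;
using System.Threading.Tasks;
using Microsoft.WindowsAzure.MobileServices;

public class TodoItem
{
    public int Id { get; set; }
    public string Text { get; set; }
    public bool Complete { get; set; }
}

public class TodoService
{
    // Placeholder URL and application key for the mobile service.
    private static readonly MobileServiceClient Client =
        new MobileServiceClient("https://your-service.azure-mobile.net/", "your-application-key");

    private readonly IMobileServiceTable<TodoItem> todoTable = Client.GetTable<TodoItem>();

    // LINQ query, translated by the client library into an HTTP REST call against the service.
    public async Task<List<TodoItem>> GetIncompleteItemsAsync()
    {
        var results = await todoTable
            .Where(item => item.Complete == false)
            .ToListAsync();
        return new List<TodoItem>(results);
    }

    // Insert a new row into the cloud table.
    public Task AddItemAsync(TodoItem item)
    {
        return todoTable.InsertAsync(item);
    }
}
```

Under the covers the client library turns the LINQ expression into a REST call with an OData filter against the service's table endpoint, which is what the text above refers to.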
Developers don’t have to write or deploy any custom server-side code in order to enable client-side code like this to execute and asynchronously populate their client UI. Because Mobile Services is part of Windows Azure, developers can later choose to augment or extend their initial solution and add custom server functionality and more advanced logic if they want. This provides maximum flexibility, and enables developers to grow and extend their solutions to meet any needs.
User Authentication and Push Notifications
Windows Azure Mobile Services also makes it incredibly easy to integrate user authentication/authorization and push notifications within your applications. You can use these capabilities to enable authentication and fine-grained access control permissions on the data you store in the cloud, as well as to trigger push notifications to users/devices when the data changes. Windows Azure Mobile Services supports the concept of “server scripts” (small chunks of server-side script that execute in response to actions) that make it really easy to enable these scenarios. Below are some tutorials that walk through common authentication/authorization/push scenarios you can implement with Windows Azure Mobile Services and Windows 8 apps:
- Enabling User Authentication
- Authorizing Users
- Get Started with Push Notifications
- Push Notifications to multiple Users
Manage and Monitor your Mobile Service
Just like with every other service in Windows Azure, you can monitor usage and metrics of your mobile service backend using the “Dashboard” tab within the Windows Azure Portal. The dashboard tab provides a built-in monitoring view of the API calls, bandwidth, and server CPU cycles of your Windows Azure Mobile Service. You can also use the “Logs” tab within the portal to review error messages. This makes it easy to monitor and track how your application is doing.
Scale Up as Your Business Grows
Windows Azure Mobile Services now allows every Windows Azure customer to create and run up to 10 Mobile Services in a free, shared/multi-tenant hosting environment (where your mobile backend will be one of multiple apps running on a shared set of server resources). This provides an easy way to get started on projects at no cost beyond the database you connect your Windows Azure Mobile Service to (note: each Windows Azure free trial account also includes a 1GB SQL Database that you can use with any number of apps or Windows Azure Mobile Services). If your client application becomes popular, you can click the “Scale” tab of your Mobile Service and switch from “Shared” to “Reserved” mode. Doing so allows you to isolate your apps so that you are the only customer within a virtual machine. This allows you to elastically scale the amount of resources your apps use, letting you scale up (or scale down) your capacity as your traffic grows. With Windows Azure you pay for compute capacity on a per-hour basis, which allows you to scale your resources up and down to match only what you need. This enables a super flexible model that is ideal for new mobile app scenarios, as well as for startups that are just getting going.
Summary
I’ve only scratched the surface of what you can do with Windows Azure Mobile Services; there are a lot more features to explore. With Windows Azure Mobile Services you’ll be able to build mobile app experiences faster than ever, and enable even better user experiences by connecting your client apps to the cloud.
Visit the Windows Azure Mobile Services development center to learn more, and build your first Windows 8 app connected with Windows Azure today. And read this getting started tutorial to walk through how you can build (in less than 5 minutes) a simple Windows 8 “Todo List” app that is cloud enabled using Windows Azure Mobile Services. Hope this helps, Scott P.S. In addition to blogging, I am also now using Twitter for quick updates and to share links. Follow me at: twitter.com/scottgu


  • Running a simple integration scenario using the Oracle Big Data Connectors on Hadoop/HDFS cluster

    - by hamsun
Between the elephant (the traditional image of the Hadoop framework) and the Oracle Iron Man (Big Data...), an English setter could be seen as the link to the right data. Data, data, data: we are living in a world where data technologies, based on popular applications, search engines, web servers, rich SMS messages, email clients, weather forecasts and so on, have a predominant role in our lives. More and more technologies are used to analyze and track our behavior, trying to detect patterns and to propose "the best/right user experience" to us, from the Google Ad services to telco companies or large consumer sites (like Amazon :) ). The more we use all these technologies, the more data we generate, and thus there is a need for huge data marts and specific hardware/software servers (such as the Exadata servers) in order to treat, analyze and understand the trends and offer new services to the users. Some of these "data feeds" are raw, unstructured data, and cannot be processed effectively by normal SQL queries. Large-scale distributed processing was an emerging infrastructure need, and the solution seemed to be the "collocation of compute nodes with the data", which in turn led to MapReduce parallel patterns and the development of the Hadoop framework, which is based on MapReduce and a distributed file system (HDFS) that runs on large clusters of rather inexpensive servers. Several Oracle products use the distributed/aggregation pattern for data calculation (Coherence, NoSQL, TimesTen), so once you are familiar with one of these technologies, let's say with Coherence aggregators, you will find the whole Hadoop/MapReduce concept very similar. Oracle Big Data Appliance is based on the Cloudera Distribution (CDH), and the Oracle Big Data Connectors can be plugged into a Hadoop cluster running the CDH distribution or equivalent Hadoop clusters. In this paper, a "lab-like" implementation of this concept is done on a single Linux x64 server, running an Oracle Database 11g Enterprise Edition Release 11.2.0.4.0 and a single-node Apache hadoop-1.2.1 HDFS cluster, using the SQL Connector for HDFS. The whole setup is fairly simple:
- Install an Oracle Database 11g Enterprise Edition Release 11.2.0.4.0 server on a Linux x64 server (or VirtualBox appliance).
- Get the Apache Hadoop distribution from: http://mir2.ovh.net/ftp.apache.org/dist/hadoop/common/hadoop-1.2.1.
- Get the Oracle Big Data Connectors from: http://www.oracle.com/technetwork/bdc/big-data-connectors/downloads/index.html?ssSourceSiteId=ocomen.
- Check the Java version of your Linux server with the command:
  java -version
  java version "1.7.0_40"
  Java(TM) SE Runtime Environment (build 1.7.0_40-b43)
  Java HotSpot(TM) 64-Bit Server VM (build 24.0-b56, mixed mode)
- Decompress the hadoop-1.2.1.tar.gz file to /u01/hadoop-1.2.1.
- Modify your .bash_profile:
  export HADOOP_HOME=/u01/hadoop-1.2.1
  export PATH=$PATH:$HADOOP_HOME/bin
  export HIVE_HOME=/u01/hive-0.11.0
  export PATH=$PATH:$HADOOP_HOME/bin:$HIVE_HOME/bin
  (also see my sample .bash_profile)
- Set up ssh trust for the Hadoop processes. This is a mandatory step; in our case we have to establish a "local trust", as we are using a single-node configuration: copy the new public keys to the list of authorized keys, then connect and test the ssh setup to your localhost.
We will run a "pseudo-Hadoop cluster" in what is called "local standalone mode": all the Hadoop Java components run in one Java process, which is enough for our demo purposes.
We need to "fine tune" some Hadoop configuration files. Go to $HADOOP_HOME/conf and modify the files core-site.xml, hdfs-site.xml and mapred-site.xml, then check that the hadoop binaries are referenced correctly from the command line by executing:
hadoop -version
As Hadoop is managing our "clustered HDFS" file system, we have to create "the mount point" and format it; the mount point is declared in core-site.xml. The layout under /u01/hadoop-1.2.1/data will be created and used by the other Hadoop components (MapReduce uses the /mapred/... layout, HDFS uses the /dfs/... layout). Format the HDFS Hadoop file system, then start the Java components for the HDFS system. As an additional check, you can use the GUI Hadoop browsers to check the content of your HDFS configuration. Once our HDFS Hadoop setup is done, you can use the HDFS file system to store data (big data :)), and move it back and forth to Oracle databases by means of the Big Data Connectors (which is the next configuration step). You can create and use a Hive DB, but in our case we will make a simple integration of "raw data", through the creation of an external table in a local Oracle instance (on the same Linux box, we run the one-node Hadoop HDFS cluster and one Oracle DB). Download some public "big data"; I use the site http://france.meteofrance.com/france/observations, from where I can get *.csv files for my big data simulations :). Here is the data layout of my example file: Download the Big Data Connector from OTN (oraosch-2.2.0.zip) and unzip it to your local file system (see picture below). Modify your environment in order to access the connector libraries, and make the following test:
[oracle@dg1 bin]$ ./hdfs_stream
Usage: hdfs_stream locationFile
[oracle@dg1 bin]$
Load the data into the Hadoop HDFS file system:
hadoop fs -mkdir bgtest_data
hadoop fs -put obsFrance.txt bgtest_data/obsFrance.txt
hadoop fs -ls /user/oracle/bgtest_data/obsFrance.txt
[oracle@dg1 bg-data-raw]$ hadoop fs -ls /user/oracle/bgtest_data/obsFrance.txt
Found 1 items
-rw-r--r-- 1 oracle supergroup 54103 2013-10-22 06:10 /user/oracle/bgtest_data/obsFrance.txt
[oracle@dg1 bg-data-raw]$ hadoop fs -ls hdfs:///user/oracle/bgtest_data/obsFrance.txt
Found 1 items
-rw-r--r-- 1 oracle supergroup 54103 2013-10-22 06:10 /user/oracle/bgtest_data/obsFrance.txt
Check the content of HDFS with the browser UI. Start the Oracle database, and run the following script in order to create the Oracle database user and the Oracle directories for the Oracle Big Data Connector (dg1 is my own DB SID; replace it with yours):
#!/bin/bash
export ORAENV_ASK=NO
export ORACLE_SID=dg1
. oraenv
sqlplus /nolog <<EOF
CONNECT / AS sysdba;
CREATE OR REPLACE DIRECTORY osch_bin_path AS '/u01/orahdfs-2.2.0/bin';
CREATE USER BGUSER IDENTIFIED BY oracle;
GRANT CREATE SESSION, CREATE TABLE TO BGUSER;
GRANT EXECUTE ON sys.utl_file TO BGUSER;
GRANT READ, EXECUTE ON DIRECTORY osch_bin_path TO BGUSER;
CREATE OR REPLACE DIRECTORY BGT_LOG_DIR as '/u01/BG_TEST/logs';
GRANT READ, WRITE ON DIRECTORY BGT_LOG_DIR to BGUSER;
CREATE OR REPLACE DIRECTORY BGT_DATA_DIR as '/u01/BG_TEST/data';
GRANT READ, WRITE ON DIRECTORY BGT_DATA_DIR to BGUSER;
EOF
Put the following in a file named t3.sh and make it executable:
hadoop jar $OSCH_HOME/jlib/orahdfs.jar \
oracle.hadoop.exttab.ExternalTable \
-D oracle.hadoop.exttab.tableName=BGTEST_DP_XTAB \
-D oracle.hadoop.exttab.defaultDirectory=BGT_DATA_DIR \
-D oracle.hadoop.exttab.dataPaths="hdfs:///user/oracle/bgtest_data/obsFrance.txt" \
-D oracle.hadoop.exttab.columnCount=7 \
-D oracle.hadoop.connection.url=jdbc:oracle:thin:@//localhost:1521/dg1 \
-D oracle.hadoop.connection.user=BGUSER \
-D oracle.hadoop.exttab.printStackTrace=true \
-createTable --noexecute
Then test the creation of the external table with it:
[oracle@dg1 samples]$ ./t3.sh
./t3.sh: line 2: /u01/orahdfs-2.2.0: Is a directory
Oracle SQL Connector for HDFS Release 2.2.0 - Production
Copyright (c) 2011, 2013, Oracle and/or its affiliates. All rights reserved.
Enter Database Password:
The create table command was not executed. The following table would be created.
CREATE TABLE "BGUSER"."BGTEST_DP_XTAB"
( "C1" VARCHAR2(4000), "C2" VARCHAR2(4000), "C3" VARCHAR2(4000), "C4" VARCHAR2(4000),
  "C5" VARCHAR2(4000), "C6" VARCHAR2(4000), "C7" VARCHAR2(4000) )
ORGANIZATION EXTERNAL
( TYPE ORACLE_LOADER
  DEFAULT DIRECTORY "BGT_DATA_DIR"
  ACCESS PARAMETERS
  ( RECORDS DELIMITED BY 0X'0A'
    CHARACTERSET AL32UTF8
    STRING SIZES ARE IN CHARACTERS
    PREPROCESSOR "OSCH_BIN_PATH":'hdfs_stream'
    FIELDS TERMINATED BY 0X'2C'
    MISSING FIELD VALUES ARE NULL
    ( "C1" CHAR(4000), "C2" CHAR(4000), "C3" CHAR(4000), "C4" CHAR(4000),
      "C5" CHAR(4000), "C6" CHAR(4000), "C7" CHAR(4000) ) )
  LOCATION ( 'osch-20131022081035-74-1' )
) PARALLEL REJECT LIMIT UNLIMITED;
The following location files would be created.
osch-20131022081035-74-1 contains 1 URI, 54103 bytes
54103 hdfs://localhost:19000/user/oracle/bgtest_data/obsFrance.txt
Then remove the --noexecute flag and create the external Oracle table for the Hadoop data. Check the results:
The create table command succeeded.
CREATE TABLE "BGUSER"."BGTEST_DP_XTAB"
( "C1" VARCHAR2(4000), "C2" VARCHAR2(4000), "C3" VARCHAR2(4000), "C4" VARCHAR2(4000),
  "C5" VARCHAR2(4000), "C6" VARCHAR2(4000), "C7" VARCHAR2(4000) )
ORGANIZATION EXTERNAL
( TYPE ORACLE_LOADER
  DEFAULT DIRECTORY "BGT_DATA_DIR"
  ACCESS PARAMETERS
  ( RECORDS DELIMITED BY 0X'0A'
    CHARACTERSET AL32UTF8
    STRING SIZES ARE IN CHARACTERS
    PREPROCESSOR "OSCH_BIN_PATH":'hdfs_stream'
    FIELDS TERMINATED BY 0X'2C'
    MISSING FIELD VALUES ARE NULL
    ( "C1" CHAR(4000), "C2" CHAR(4000), "C3" CHAR(4000), "C4" CHAR(4000),
      "C5" CHAR(4000), "C6" CHAR(4000), "C7" CHAR(4000) ) )
  LOCATION ( 'osch-20131022081719-3239-1' )
) PARALLEL REJECT LIMIT UNLIMITED;
The following location files were created.
osch-20131022081719-3239-1 contains 1 URI, 54103 bytes
54103 hdfs://localhost:19000/user/oracle/bgtest_data/obsFrance.txt
This is the view from SQL Developer. And finally, the number of lines in the Oracle table, imported from our Hadoop HDFS cluster:
SQL> select count(*) from "BGUSER"."BGTEST_DP_XTAB";
COUNT(*)
----------
1151
In a next post we will integrate data from a Hive database and try some ODI integrations with the ODI Big Data connector. Our simplistic approach is just a first step to show you how this unstructured data world can be integrated with the Oracle infrastructure. Hadoop, Big Data and NoSQL are great technologies; they are widely used, and Oracle offers a large integration infrastructure based on these services. Oracle University presents a complete curriculum on all the related Oracle technologies:
NoSQL:
- Introduction to Oracle NoSQL Database
- Using Oracle NoSQL Database
Big Data:
- Introduction to Big Data
- Oracle Big Data Essentials
- Oracle Big Data Overview
Oracle Data Integrator:
- Oracle Data Integrator 12c: New Features
- Oracle Data Integrator 11g: Integration and Administration
- Oracle Data Integrator: Administration and Development
- Oracle Data Integrator 11g: Advanced Integration and Development
Oracle Coherence 12c:
- Oracle Coherence 12c: New Features
- Oracle Coherence 12c: Share and Manage Data in Clusters
Oracle GoldenGate:
- Oracle GoldenGate 11g: Fundamentals for Oracle
- Oracle GoldenGate 11g: Fundamentals for SQL Server
- Oracle GoldenGate 11g Fundamentals for Oracle
- Oracle GoldenGate 11g Fundamentals for DB2
- Oracle GoldenGate 11g Fundamentals for Teradata
- Oracle GoldenGate 11g Fundamentals for HP NonStop
- Oracle GoldenGate 11g Management Pack: Overview
- Oracle GoldenGate 11g Troubleshooting and Tuning
- Oracle GoldenGate 11g: Advanced Configuration for Oracle
Other Resources:
- Apache Hadoop: http://hadoop.apache.org/ is the homepage for these technologies.
- "Hadoop: The Definitive Guide, 3rd Edition" by Tom White is a classic read for people who want to know more about Hadoop, and some active "googling" will also give you more references.
About the author: Eugene Simos is based in France and joined Oracle through the BEA/WebLogic acquisition, where he worked in Professional Services, Support, and Education for major accounts across the EMEA region. He has worked in the banking sector and with AT&T and telco companies, which gave him extensive experience with production environments. Eugene currently specializes in Oracle Fusion Middleware, teaching an array of courses (WebLogic/WebCenter, Content, BPM/SOA/Identity-Security/GoldenGate/Virtualisation/Unified Comm Suite) throughout the EMEA region.


  • CodePlex Daily Summary for Wednesday, April 07, 2010

    CodePlex Daily Summary for Wednesday, April 07, 2010New ProjectsAStar.net: AStar.net is a project for compute the A* path finding algorithm. It expose classes and interface that can be used for all purpose with multi-threa...Auto Complete for MOSS 2007: Auto Complete for MOSS 2007 (WSS 3.0) makes it easier for all user to fill list of nessesary string in the field. AutoPoco: AutoPoco is a framework with the purpose of fluently building test data from Plain Old CLR ObjectsBlueProject: BlueProjectColor Picker for MOSS 2007: Color picker for MOSS 2007 (WSS 3.0) makes it easier for some user to choose color in the field. ComBrowser2: ComBrowser2DCommunication: Communication Components and Software By Delphi On Windows.DirST: Allows one to replicate a DIRectory STructure without copying files. Written in C#.Discussion column for MOSS 2007: Discussion column for MOSS 2007/ (WSS 3.0) is a field column for different type lists such as Custom List, Document Library, Issue Tracking, not on...Effect Custom Tool for Visual Studio: Effect Custom Tool for Visual Studio is a visual studio 2008 extension that helps you generate c# classes from effect (*.fx) files for use with Xna...Energon: Energon is a framework to create and run software energy consumption tests. Requires a couple of PC with a TCP connections, a phidgets ammeter and ...Fiansa: Proyecto FiansaFirstSpark: FirstSpark is a sample Spark View Engine projectISOM: Internetowy System Obsługi Magisterek La Carta Mas Alta: La Carta Mas Alta is an open source card game totally written in PHP and HTML. This cross-platform and cross-browser game was tested under BeOS, Li...Near forums - ASP.NET MVC forum engine: Open source SEO friendly ASP.NET MVC forum engine. Features: Navigation in forums, topics and tags; login with Facebook Connect and Single sign-on;...PEC: Still editingPowershell Zip Export/Import Cmdlet Module: Powershellzip is a powershell module with a set of Cmdlets for zip file export, import and processingProgress bar for MOSS 2007: Progress bar for MOSS 2007 (WSS 3.0) makes it easier for some user to display progress in the field in percent (0-100%). SharePoint Accelerators: This project delivers a number of SharePoint accelerators that make your every-day SharePoint life easier.Shweet: SharePoint 2010 Team Messaging built with Pex: Shweet is a simple SharePoint Foundation 2010 application that allows teams to do messaging and subscriptions in the style similar to twitter. De...SilverlightEncoder: Video and audio encoder for Silverlight 4 Out-of-BrowserStarMath: Static Array Math Library: While there are already countless math libraries for performing common matrix/array functions, StarMath is distinguished by its simplicity and flex...StringToNumber: StringToNumber is a .NET library for parsing numbers in their written form into their numeric equivalent. It's developed in C#. It was created to...SubstitutionITIS: <project name="Substitution" language="c-sharp" to="myschool" whatdo="manageSubstitutionTeachers" />TamTam.SharePoint2010.LinkedIn: SP2010 and LinkedIn working togetherThink And Explore: Think my own wayusenet-o-matic: small mobile optimised webscript to search various NZB sources like newzbin and NZBmatrix and send donwloads to your SABnzbd server. Valid HTML5 Templates for VisualStudio: This project will host valid HTML templates for Visual Studio. 
The primary focus however will be on the newer standard HTML5 and Visual Studio 2010XSS Attack: This tool will simulate an attack on your database and update up to 5000 rows in every table and replace your strings in your database with random ...Yazgelistir Beta: Yazgelistir'in beta sitesiNew ReleasesAStar.net: AStar.net 1.0 downloads: AStar.net 1.0 downloadsAvalaible downloads:Astar.net dll - Runtime library ready to be included in a project. Astar.net source project - The Visu...AutoPoco: 0.1 - Initial Release: This is the initial release, changes pending, lots left to do, check it out though!Bag of Tricks: Bag of Tricks - WPF - Libraries and Sample App: Here is a drop of the Bag of Tricks. It targets the 3.5 Client Profile.Boxee Launcher: Boxee Launcher 1.0.1.4: Taskbar will now be hiddenDesignit Umbraco Newsletter Package: ver 1.0.0 beta1: Please remember this is a beta version. If you have any problems or issues, don't hesitate to use the issue tracker and/or forum on this site. Ple...Effect Custom Tool for Visual Studio: Effect Custom Tool for Visual Studio 2008: Effect Custom Tool for Visual Studio is a visual studio 2008 extension that helps you generate c# classes from effect (*.fx) files for use with Xna...Fluent Ribbon Control Suite: Fluent Ribbon Control Suite 1.1: Fluent Ribbon Control Suite 1.1 Includes: Fluent.dll (with .pdb and .xml) Showcase Application Samples Foundation (Tabs, Groups, Contextual Tab...GsGrid: gsgrid 1.6.5: gsgrid 1.6.5Home Access Plus+: v3.2.5.2: v3.2.5.1 Release Change Log: Attempt to fix Domain Admin Lookup box File Changes: ~/bin/CHS Extranet.dll ~/bin/CHS Extranet.pdbHulu Launcher: Hulu Launcher 1.0.1.4: Will now hide taskbar.Icarus Scene Engine: Icarus Professional 2 Alpha 2a v 1.10.404.936: Alpha release 2 of Icarus Professional. This release includes: IcarusX: The ActiveX-based browser control for rendering IPX projects online. Icaru...kdar: KDAR 0.0.19: KDAR - Kernel Debugger Anti Rootkit - thread start notifier routine check added - registry callback сheck added - NDIS6 checks added - some bug i...Mobile Device Browser File: Mobile Device Browser File (2010-04-07): The Mobile Browser Definition File contains definitions for individual mobile devices and browsers. At run time, ASP.NET uses the information in th...MvcContrib: a Codeplex Foundation project: Portable Area Template: Use this Visual Studio 2008 Project template to create new portable areas. Drop this file in your Documents\Visual Studio 2008\Templates\ProjectTe...Office Apps: 0.8.8: new ui for Document.Viewer bug fix'sProtoforma | Tactica Adversa: Skilful 0.3.4.477: RC1ROOT Builder: ROOT Builder 1.41: Simplifies building of ROOT on the windows platform by generating a Visual Studio C make project that will build and run (and debug ROOT). See Inst...SharePoint Accelerators: Search AutoComplete for SharePoint Lists: Search AutoComplete for SharePoint List is a Web Part that allows you to search contents of a SharePoint. Search is interactive and offers you resu...SharePoint Labs: SPLab4002A-FRA-Level100: SPLab4002A-FRA-Level100 This SharePoint Lab will teach you the 2nd best practice you should apply when writing code with the SharePoint API. Lab La...SharePoint Labs: SPLab4003A-FRA-Level100: SPLab4003A-FRA-Level100 This SharePoint Lab will teach you the 3rd best practice you should apply when writing code with the SharePoint API. 
Lab La...SQL Compact data and schema script utility: Version 3.0: This release contains 4 downloadable files: - SSMS 2008 scripting add-in - SQL Server 2005/2008 command line utility to generate a script with sch...StarMath: Static Array Math Library: StarMath Source Files: The source file package includes the main library StarMath.dll (and it's source files), and an example exe project to invoke commands from StarMath.stefvanhooijdonk.com: Powershell Solution Install Script: Powershell Example script to install a SP2010 Solution, which actually waits for the Retraction and Deployment jobs.TamTam.SharePoint2010.LinkedIn: 0.0.0.1: How to use/install Small instruction to use this code/solution step 1 Add the Farm Solution to your SP2010 installation step 2 Go to the MySite H...usenet-o-matic: V 0.2: supports Newzbin and NZBmatrix as index sources and SABnzbd as download serverVCC: Latest build, v2.1.30406.0: Automatic drop of latest buildVisual Studio DSite: E-Z Image To PDF Converter Beta: This simple little program can convert all common image formats into pdfs and can even encrypt the pdf with an encrpytion key.Web and Load Test Plugins for Visual Studio Team Test: Release 2.0: Release 2.0 is targeted at VS 2010. VS 2010 exposes major new extensibility points: 1) Recorder plugins enable you to do custom correlations and o...Wicked Compression ASP.NET HTTP Module: WickedCompressionModule 4.0 Alpha: New Features, New Enhancements! - Visual Studio 2010 Projects! - Support for ASP.NET 2.0, 3.5, and 4.0 - AJAX Support for ASP.NET 3.5 and 4.0 - Bu...x5s - test encodings and character transformations to find XSS hotspots: x5s v1.0.0 beta: This is the v1.0 beta release of x5s. All feedback welcome in planning for the next release. Make sure Fiddler is installed prior to running th...xvanneste: Silverlight SharePoint: Fichiers du webcast sur silverlight: Silverlight OM Silverlight Webpart Silverlight Embedded ressource Silverlight List ViewYet Another Web App Monitoring Tool (YAWAMT): yawamt v0.5: A new release has seen the light :-) I've lowered the release to version 0.5 but changed some major things: - deletion of URLS works - settings can...Zinc Launcher: Zinc Launcher 1.0.1.1: Taskbar will now be hidden Delay to show Zinc was reduced to improve responsiveness.Most Popular ProjectsRawrWBFS ManagerMicrosoft SQL Server Product Samples: DatabaseASP.NET Ajax LibrarySilverlight ToolkitAJAX Control ToolkitWindows Presentation Foundation (WPF)ASP.NETMicrosoft SQL Server Community & SamplesFacebook Developer ToolkitMost Active ProjectsGraffiti CMSnopCommerce. Open Source online shop e-commerce solution.RawrFacebook Developer ToolkitShweet: SharePoint 2010 Team Messaging built with Pexpatterns & practices – Enterprise LibraryNcqrs Framework - The CQRS framework for .NETjQuery Library for SharePoint Web ServicesIonics Isapi Rewrite FilterAcadsys

    Read the article

  • SQL Spatial: Getting “nearest” calculations working properly

    - by Rob Farley
    If you’ve ever done spatial work with SQL Server, I hope you’ve come across the ‘nearest’ problem. You have five thousand stores around the world, and you want to identify the one that’s closest to a particular place. Maybe you want the store closest to the LobsterPot office in Adelaide, at -34.925806, 138.605073. Or our new US office, at 42.524929, -87.858244. Or maybe both! You know how to do this. You don’t want to use an aggregate MIN or MAX, because you want the whole row, telling you which store it is. You want to use TOP, and if you want to find the closest store for multiple locations, you use APPLY. Let’s do this (but I’m going to use addresses in AdventureWorks2012, as I don’t have a list of stores). Oh, and before I do, let’s make sure we have a spatial index in place. I’m going to use the default options. CREATE SPATIAL INDEX spin_Address ON Person.Address(SpatialLocation); And my actual query: WITH MyLocations AS (SELECT * FROM (VALUES ('LobsterPot Adelaide', geography::Point(-34.925806, 138.605073, 4326)),                        ('LobsterPot USA', geography::Point(42.524929, -87.858244, 4326))                ) t (Name, Geo)) SELECT l.Name, a.AddressLine1, a.City, s.Name AS [State], c.Name AS Country FROM MyLocations AS l CROSS APPLY (     SELECT TOP (1) *     FROM Person.Address AS ad     ORDER BY l.Geo.STDistance(ad.SpatialLocation)     ) AS a JOIN Person.StateProvince AS s     ON s.StateProvinceID = a.StateProvinceID JOIN Person.CountryRegion AS c     ON c.CountryRegionCode = s.CountryRegionCode ; Great! This is definitely working. I know both those City locations, even if the AddressLine1s don’t quite ring a bell. I’m sure I’ll be able to find them next time I’m in the area. But of course what I’m concerned about from a querying perspective is what’s happened behind the scenes – the execution plan. This isn’t pretty. It’s not using my index. It’s sucking every row out of the Address table TWICE (which sucks), and then it’s sorting them by the distance to find the smallest one. It’s not pretty, and it takes a while. Mind you, I do like the fact that it saw an indexed view it could use for the State and Country details – that’s pretty neat. But yeah – users of my nifty website aren’t going to like how long that query takes. The frustrating thing is that I know that I can use the index to find locations that are within a particular distance of my locations quite easily, and Microsoft recommends this for solving the ‘nearest’ problem, as described at http://msdn.microsoft.com/en-au/library/ff929109.aspx. Now, in the first example on this page, it says that the query there will use the spatial index. But when I run it on my machine, it does nothing of the sort. I’m not particularly impressed. But what we see here is that parallelism has kicked in. In my scenario, it’s split the data up into 4 threads, but it’s still slow, and not using my index. It’s disappointing. But I can persuade it with hints! If I tell it to FORCESEEK, or use my index, or even turn off the parallelism with MAXDOP 1, then I get the index being used, and it’s a thing of beauty! Part of the plan is here: It’s massive, and it’s ugly, and it uses a TVF… but it’s quick. The way it works is to hook into the GeodeticTessellation function, which is essentially finds where the point is, and works out through the spatial index cells that surround it. This then provides a framework to be able to see into the spatial index for the items we want. 
You can read more about it at http://msdn.microsoft.com/en-us/library/bb895265.aspx#tessellation – including a bunch of pretty diagrams. One of those times when we have a much more complex-looking plan, but just because of the good that’s going on. This tessellation stuff was introduced in SQL Server 2012. But my query isn’t using it. When I try to use the FORCESEEK hint on the Person.Address table, I get the friendly error: Msg 8622, Level 16, State 1, Line 1 Query processor could not produce a query plan because of the hints defined in this query. Resubmit the query without specifying any hints and without using SET FORCEPLAN. And I’m almost tempted to just give up and move back to the old method of checking increasingly large circles around my location. After all, I can even leverage multiple OUTER APPLY clauses just like I did in my recent Lookup post. WITH MyLocations AS (SELECT * FROM (VALUES ('LobsterPot Adelaide', geography::Point(-34.925806, 138.605073, 4326)),                        ('LobsterPot USA', geography::Point(42.524929, -87.858244, 4326))                ) t (Name, Geo)) SELECT     l.Name,     COALESCE(a1.AddressLine1,a2.AddressLine1,a3.AddressLine1),     COALESCE(a1.City,a2.City,a3.City),     s.Name AS [State],     c.Name AS Country FROM MyLocations AS l OUTER APPLY (     SELECT TOP (1) *     FROM Person.Address AS ad     WHERE l.Geo.STDistance(ad.SpatialLocation) < 1000     ORDER BY l.Geo.STDistance(ad.SpatialLocation)     ) AS a1 OUTER APPLY (     SELECT TOP (1) *     FROM Person.Address AS ad     WHERE l.Geo.STDistance(ad.SpatialLocation) < 5000     AND a1.AddressID IS NULL     ORDER BY l.Geo.STDistance(ad.SpatialLocation)     ) AS a2 OUTER APPLY (     SELECT TOP (1) *     FROM Person.Address AS ad     WHERE l.Geo.STDistance(ad.SpatialLocation) < 20000     AND a2.AddressID IS NULL     ORDER BY l.Geo.STDistance(ad.SpatialLocation)     ) AS a3 JOIN Person.StateProvince AS s     ON s.StateProvinceID = COALESCE(a1.StateProvinceID,a2.StateProvinceID,a3.StateProvinceID) JOIN Person.CountryRegion AS c     ON c.CountryRegionCode = s.CountryRegionCode ; But this isn’t friendly-looking at all, and I’d use the method recommended by Isaac Kunen, who uses a table of numbers for the expanding circles. It feels old-school though, when I’m dealing with SQL 2012 (and later) versions. So why isn’t my query doing what it’s supposed to? Remember the query... WITH MyLocations AS (SELECT * FROM (VALUES ('LobsterPot Adelaide', geography::Point(-34.925806, 138.605073, 4326)),                        ('LobsterPot USA', geography::Point(42.524929, -87.858244, 4326))                ) t (Name, Geo)) SELECT l.Name, a.AddressLine1, a.City, s.Name AS [State], c.Name AS Country FROM MyLocations AS l CROSS APPLY (     SELECT TOP (1) *     FROM Person.Address AS ad     ORDER BY l.Geo.STDistance(ad.SpatialLocation)     ) AS a JOIN Person.StateProvince AS s     ON s.StateProvinceID = a.StateProvinceID JOIN Person.CountryRegion AS c     ON c.CountryRegionCode = s.CountryRegionCode ; Well, I just wasn’t reading http://msdn.microsoft.com/en-us/library/ff929109.aspx properly. The following requirements must be met for a Nearest Neighbor query to use a spatial index: A spatial index must be present on one of the spatial columns and the STDistance() method must use that column in the WHERE and ORDER BY clauses. The TOP clause cannot contain a PERCENT statement. The WHERE clause must contain a STDistance() method. 
If there are multiple predicates in the WHERE clause then the predicate containing STDistance() method must be connected by an AND conjunction to the other predicates. The STDistance() method cannot be in an optional part of the WHERE clause. The first expression in the ORDER BY clause must use the STDistance() method. Sort order for the first STDistance() expression in the ORDER BY clause must be ASC. All the rows for which STDistance returns NULL must be filtered out. Let’s start from the top. 1. Needs a spatial index on one of the columns that’s in the STDistance call. Yup, got the index. 2. No ‘PERCENT’. Yeah, I don’t have that. 3. The WHERE clause needs to use STDistance(). Ok, but I’m not filtering, so that should be fine. 4. Yeah, I don’t have multiple predicates. 5. The first expression in the ORDER BY is my distance, that’s fine. 6. Sort order is ASC, because otherwise we’d be starting with the ones that are furthest away, and that’s tricky. 7. All the rows for which STDistance returns NULL must be filtered out. But I don’t have any NULL values, so that shouldn’t affect me either. ...but something’s wrong. I do actually need to satisfy #3. And I do need to make sure #7 is being handled properly, because there are some situations (eg, differing SRIDs) where STDistance can return NULL. It says so at http://msdn.microsoft.com/en-us/library/bb933808.aspx – “STDistance() always returns null if the spatial reference IDs (SRIDs) of the geography instances do not match.” So if I simply make sure that I’m filtering out the rows that return NULL… …then it’s blindingly fast, I get the right results, and I’ve got the complex-but-brilliant plan that I wanted. It just wasn’t overly intuitive, despite being documented. @rob_farley
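For completeness, here's a sketch of that final change, using the same AdventureWorks2012 query from above – the only addition is the IS NOT NULL predicate on STDistance(), which satisfies requirements #3 and #7 and lets the optimizer pick the plan that uses the spatial index:

    WITH MyLocations AS
        (SELECT * FROM (VALUES ('LobsterPot Adelaide', geography::Point(-34.925806, 138.605073, 4326)),
                               ('LobsterPot USA',      geography::Point(42.524929, -87.858244, 4326))
                       ) t (Name, Geo))
    SELECT l.Name, a.AddressLine1, a.City, s.Name AS [State], c.Name AS Country
    FROM MyLocations AS l
    CROSS APPLY (
        SELECT TOP (1) *
        FROM Person.Address AS ad
        -- filter out rows where STDistance returns NULL so the Nearest Neighbor plan applies
        WHERE l.Geo.STDistance(ad.SpatialLocation) IS NOT NULL
        ORDER BY l.Geo.STDistance(ad.SpatialLocation)
        ) AS a
    JOIN Person.StateProvince AS s
        ON s.StateProvinceID = a.StateProvinceID
    JOIN Person.CountryRegion AS c
        ON c.CountryRegionCode = s.CountryRegionCode ;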

    Read the article

  • Customize Team Build 2010 – Part 16: Specify the relative reference path

    In the series the following parts have been published Part 1: Introduction Part 2: Add arguments and variables Part 3: Use more complex arguments Part 4: Create your own activity Part 5: Increase AssemblyVersion Part 6: Use custom type for an argument Part 7: How is the custom assembly found Part 8: Send information to the build log Part 9: Impersonate activities (run under other credentials) Part 10: Include Version Number in the Build Number Part 11: Speed up opening my build process template Part 12: How to debug my custom activities Part 13: Get control over the Build Output Part 14: Execute a PowerShell script Part 15: Fail a build based on the exit code of a console application Part 16: Specify the relative reference path As I have already blogged about, it is not intuitive how to specify the paths where the build server has to look for references that are stored in Source Control. It is a common practice to store 3rd party libraries in Source Control, so they are available to everyone, everyone uses the same version of the libraries and updating a library can be done centrally. In Team Build 2010 these paths are specified as a parameter for MSBuild. What we will do in this post is building the values for this parameter based on the values in an argument. You are now pretty aware how to customize the build template, so let’s do the modifications in another way. Instead of opening the xaml file in the workflow designer, we open it in the XML editor. You can open it in the XML Editor by either selecting the Open with menu (see the context menu), or by choosing the View code option. To add this functionality we need to: Specify a new argument Add the argument to the metadata Build the absolute paths for the references and add these paths to the MSBuild arguments 1. Specify a new argument Locate at the top of the document the Members (which are the arguments) of the XAML and add the following line <x:Property Name="ReferencePaths" Type="InArgument(s:String[])" /> 2. Add the argument to the metadata Then locate the line <mtbw:ProcessParameterMetadataCollection> and paste the following line <mtbw:ProcessParameterMetadata Category="Misc" Description="The list of reference paths, relative to the root path in the Workspace mapping." DisplayName="Reference paths" ParameterName="ReferencePaths" /> 3. Build the absolute paths for the references and add these paths to the MSBuild arguments Now locate the place where the assignments are done to the variables used in the agent. 
And add the following lines after the last Assign activity         <Sequence DisplayName="Initialize ReferencePath" sap:VirtualizedContainerService.HintSize="464,428">           <Sequence.Variables>             <Variable x:TypeArguments="x:String" Name="ReferencePathsArgument">               <Variable.Default>                 <Literal x:TypeArguments="x:String" Value="" />               </Variable.Default>             </Variable>           </Sequence.Variables>           <sap:WorkflowViewStateService.ViewState>             <scg:Dictionary x:TypeArguments="x:String, x:Object">               <x:Boolean x:Key="IsExpanded">True</x:Boolean>             </scg:Dictionary>           </sap:WorkflowViewStateService.ViewState>           <ForEach x:TypeArguments="x:String" DisplayName="Iterate through the paths" sap:VirtualizedContainerService.HintSize="287,206" mtbwt:BuildTrackingParticipant.Importance="Low" Values="[ReferencePaths]">             <ActivityAction x:TypeArguments="x:String">               <ActivityAction.Argument>                 <DelegateInArgument x:TypeArguments="x:String" Name="path" />               </ActivityAction.Argument>               <Assign x:TypeArguments="x:String" DisplayName="Build ReferencePath argument" sap:VirtualizedContainerService.HintSize="257,100" mtbwt:BuildTrackingParticipant.Importance="Low"  To="[ReferencePathsArgument]" Value="[If(String.IsNullOrEmpty(ReferencePathsArgument), &quot;&quot;, ReferencePathsArgument + &quot;;&quot;) + IO.Path.Combine(SourcesDirectory, path)]" />             </ActivityAction>           </ForEach>           <Assign DisplayName="Append the reference paths to the MSBuild Arguments" sap:VirtualizedContainerService.HintSize="287,58">             <Assign.To>               <OutArgument x:TypeArguments="x:String">[MSBuildArguments]</OutArgument>             </Assign.To>             <Assign.Value>               <InArgument x:TypeArguments="x:String">[String.Format("{0} /p:ReferencePath=""{1}""", MSBuildArguments, ReferencePathsArgument)]</InArgument>             </Assign.Value>           </Assign>         </Sequence> Now you can use the template to specify the paths relative to SourcesDirectory. You can download the full solution at BuildProcess.zip. It will include the sources of every part and will continue to evolve.
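As a side note (this is not part of the build template itself), a rough C# sketch of what the "Initialize ReferencePath" sequence computes may make the XAML easier to verify. The names ReferencePaths, SourcesDirectory and MSBuildArguments mirror the workflow argument and variable names used above; the sample values in the trailing comment are purely illustrative.

    // Illustrative only: the ForEach/Assign pair above effectively does this.
    using System;
    using System.IO;
    using System.Linq;

    static class ReferencePathHelper
    {
        public static string AppendReferencePaths(string msBuildArguments, string sourcesDirectory, string[] referencePaths)
        {
            // Make each relative path absolute under the sources directory and join them
            // with semicolons, as the "Build ReferencePath argument" Assign does.
            string referencePathsArgument = string.Join(";",
                referencePaths.Select(path => Path.Combine(sourcesDirectory, path)));

            // Append the /p:ReferencePath property to the existing MSBuild arguments,
            // as the final Assign activity does.
            return String.Format("{0} /p:ReferencePath=\"{1}\"", msBuildArguments, referencePathsArgument);
        }
    }

    // e.g. AppendReferencePaths("/p:Configuration=Release", @"C:\Builds\1\Sources", new[] { @"Libs\NUnit" })
    //      -> /p:Configuration=Release /p:ReferencePath="C:\Builds\1\Sources\Libs\NUnit"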

    Read the article

  • CodePlex Daily Summary for Friday, December 31, 2010

    CodePlex Daily Summary for Friday, December 31, 2010Popular ReleasesFree Silverlight & WPF Chart Control - Visifire: Visifire SL and WPF Charts v3.6.6 Released: Hi, Today we are releasing final version of Visifire, v3.6.6 with the following new feature: * TextDecorations property is implemented in Title for Chart. * TitleTextDecorations property is implemented in Axis. * MinPointHeight property is now applicable for Column and Bar Charts. Also this release includes few bug fixes: * ToolTipText property of DataSeries was not getting applied from Style. * Chart threw exception if IndicatorEnabled property was set to true and Too...Windows Weibo all in one for Sina Sohu and QQ: WeiBee V0.1: WeiBee is an all in one Twitter tool, which can update Wei Bo at the same time for websites. It intends to support t.sohu.com, t.sina.com.cn and t.qq.com.cn. If you have WeiBo at SOHU, SINA and QQ, you can try this tool to help you save time to open all the webpages to update your status. For any business opportunity, such as put advertise on China Twitter market, and to build a custom WeiBo tool, please reach my email box qq1800@gmail.com My official WeiBo is http://mediaroom.t.sohu.comStyleCop Compliant Visual Studio Code Snippets: Visual Studio Code Snippets - January 2011: StyleCop Compliant Visual Studio Code Snippets Visual Studio 2010 provides C# developers with 38 code snippets, enhancing developer productivty and increasing the consistency of the code. Within this project the original code snippets have been refactored to provide StyleCop compliant versions of the original code snippets while also adding many new code snippets. Within the January 2011 release you'll find 82 code snippets to make you more productive and the code you write more consistent!...WPF Application Framework (WAF): WPF Application Framework (WAF) 2.0.0.2: Version: 2.0.0.2 (Milestone 2): This release contains the source code of the WPF Application Framework (WAF) and the sample applications. Requirements .NET Framework 4.0 (The package contains a solution file for Visual Studio 2010) The unit test projects require Visual Studio 2010 Professional Remark The sample applications are using Microsoft’s IoC container MEF. However, the WPF Application Framework (WAF) doesn’t force you to use the same IoC container in your application. You can use ...eCompany: eCompany v0.2.0 Build 63: Version 0.2.0 Build 63: Added Splash screen & about box Added downloading of currencies when eCompany launched for the first time (must close any bug caused by no currency rate existing) Added corp creation when eCompany launched for the first time (for now, you didn't need to edit the company.xml file manually) You just need to decompress file "eCompany v0.2.0.63.zip" into your current eCompany install directory.SQL Monitor - tracking sql server activities: SQL Monitor 3.0 alpha 8: 1. added truncate table/defrag index/check db functions 2. improved alert 3. 
fixed problem with alert causing config file corrupted(hopefully)Analysis Services Stored Procedure Project: 1.3.5 Release: This release includes the following fixes and new functionality: Updates to GetCubeLastProcessedDate to work with perspectives Fixes to reports that call Discover functions improving drillthrough functions against perspectives improving ExecuteDrillthroughAndFixColumns logic fixing situation where MDX query calling certain ASSP sprocs which opened external connections caused deadlock to SSAS processing small fix to Partition code when DbColumnName property doesn't exist changes...DocX: DocX v1.0.0.11: Building Examples projectTo build the Examples project, download DocX.dll and add it as a reference to the project. OverviewThis version of DocX contains many bug fixes, it is a serious step towards a stable release. Added1) Unit testing project, 2) Examples project, 3) To many bug fixes to list here, see the source code change list history.Cosmos (C# Open Source Managed Operating System): 71406: This is the second release supporting the full line of Visual Studio 2010 editions. Changes since release 71246 include: Debug info is now stored in a single .cpdb file (which is a Firebird database) Keyboard input works now (using Console.ReadLine) Console colors work (using Console.ForegroundColor and .BackgroundColor)AutoLoL: AutoLoL v1.5.0: Added the all new Masteries Browser which replaces the Quick Open combobox AutoLoL will now attemt to create file associations for mastery (*.lolm) files Each Mastery Build can now contain keywords that the Masteries Browser will use for filtering Changed the way AutoLoL detects if another instance is already running Changed the format of the mastery files to allow more information stored in* Dialogs will now focus the Ok or Cancel button which allows the user to press Return to clo...Paint.NET PSD Plugin: 1.6.0: Handling of layer masks has been greatly improved. Improved reliability. Many PSD files that previously loaded in as garbage will now load in correctly. Parallelized loading. PSD files containing layer masks will load in a bit quicker thanks to the removal of the sequential bottleneck. Hidden layers are no longer made visible on save. Many thanks to the users who helped expose the layer masks problem: Rob Horowitz, M_Lyons10. Please keep sending in those bug reports and PSD repro files!Facebook C# SDK: 4.1.1: From 4.1.1 Release: Authentication bug fix caused by facebook change (error with redirects in Safari) Authenticator fix, always returning true From 4.1.0 Release Lots of bug fixes Removed Dynamic Runtime Language dependencies from non-dynamic platforms. Samples included in release for ASP.NET, MVC, Silverlight, Windows Phone 7, WPF, WinForms, and one Visual Basic Sample Changed internal serialization to use Json.net BREAKING CHANGE: Canvas Session is no longer supported. Use Signed...Catel - WPF and Silverlight MVVM library: 1.0.0: And there it is, the final release of Catel, and it is no longer a beta version!EnhSim: EnhSim 2.2.7 ALPHA: 2.2.7 ALPHAThis release supports WoW patch 4.03a at level 85 To use this release, you must have the Microsoft Visual C++ 2010 Redistributable Package installed. This can be downloaded from http://www.microsoft.com/downloads/en/details.aspx?FamilyID=A7B7A05E-6DE6-4D3A-A423-37BF0912DB84 To use the GUI you must have the .NET 4.0 Framework installed. 
This can be downloaded from http://www.microsoft.com/downloads/en/details.aspx?FamilyID=9cfb2d51-5ff4-4491-b0e5-b386f32c0992 - Mongoose has bee...Euro for Windows XP: ChangeRegionalSettings 1..0: *Rocket Framework (.Net 4.0): Rocket Framework for Windows V 1.0.0: Architecture is reviewed and adjusted in a way so that I can introduce the Web version and WPF version of this framework next. - Rocket.Core is introduced - Controller button functions revisited and updated - DB is renewed to suite the implemented features - Create New button functionality is changed - Add Question Handling featuresFlickr Wallpaper Rotator (for Windows desktop): Wallpaper Flickr 1.1: Some minor bugfixes (mostly covering when network connection is flakey, so I discovered them all while at my parents' house for Christmas).NoSimplerAccounting: NoSimplerAccounting 6.0: -Fixed a bug in expense category report.NHibernate Mapping Generator: NHibernate Mapping Generator 2.0: Added support for Postgres (Thanks to Angelo)NewLife XCode: XCode v6.5.2010.1223 ????(????v3.5??): XCode v6.5.2010.1223 ????,??: NewLife.Core ??? NewLife.Net ??? XControl ??? XTemplate ????,??C#?????? XAgent ???? NewLife.CommonEnitty ??????(???,XCode??????) XCode?? ?????????,??????????????????,?????95% XCode v3.5.2009.0714 ??,?v3.5?v6.0???????????????,?????????。v3.5???????????,??????????????。 XCoder ??XTemplate?????????,????????XCode??? XCoder_Src ???????(????XTemplate????),??????????????????New Projects1i2m3i4s5e6r7p: 1i2m3i4s5e6r7pA3 Fashion Web: A complete e-commerce siteAfonsoft Blog - Blog de teste: Blog para teste de desenvolvimento em MySQL ou SQLServer com ASP.NET 3.5AnyGrid for ASP.NET MVC: Which grid component should you use for your ASP.NET MVC project? How about all of them? AnyGrid makes it easy to switch between grid implementations, allowing a single action to, e.g., use two different grids for desktop and mobile views. It also supports DataAnnnotations.BizTalk Map Test Framework: The Map Test Framework makes it easier for BizTalk developers to test their maps. You'll no longer have to maintain a whole bunch of XML files for your tests. The use of template files and xpath queries to perform tests will increase your productivity tremendously.BlogResult.NET: A simple Blogging starter kit using S#harp Architecture. This project was originally course material sample code for an MVC class, but we decided to open source it and make it available to the community.Caro: Code demo caro game.Ecozany Skin for DotNetNuke: This Ecozany package contains three sample skins and a collection of containers for use in your DotNetNuke web sites.Enhanced lookup field: Enhanced lookup field makes it easier for end users to create filters. You'll no longer have to use a custom field or javascript to have a parent child lookup for example. It's developed in C#. - Parent/multiple child lookups - Advanced filters options - Easy to usejobglance: This is job siteMaxLeafWeb_K3: MaxLeafWeb_K3Miage Kart: Projet java M1msystem: MVC. Net web systemNetController: Controle sem teclas de direção, usando acelerômentro e .NET micro Framework.Pencils - A Blog Site framework: Another Blog site frameworkPICTShell: PICT is an efficient way to design test cases and test configurations for software systems. PICT Shell is the GUI for PICT.Portal de Solicitação de Serviços: O Portal de Solicitação de Serviços é uma solução que atende diversas empresas prestadores de serviços. 
Inicialmente desenvolvido para atender empresa de informática, mas o objetivo é que com a publicação do fonte, ele seja extendido à diversos tipos de prestadoras.Runery: Runery RSPSsCut4s: sCut4sSumacê Jogava Caxangá?: sumace makes it easier for gamers to play. You'll no longer have to play. It's developed in XNA.Supply_Chain_FInance: Supply Chain Finance thrives these years all over the world. In exploring the inner working mechanism ,we can sought to have a way to embed an information system in it. We develop the system to suit the domesticated supply chain finace .TestAmir: ??? ????? ??? ???The Cosmos: School project on a SOA based system (Loan system)Time zone: A c# library for manage time changes for any time zone in the world. Based on olson database.Validate: Validate is a collection of extension methods that let you add validations to any object and display validations in a user/developer friendly format. Its an extremely light weight validation framework with a zero learning curve.Windows Phone 7 Silverlight ListBox with CheckBox Control: A Windows Phone 7 ListBox control, providing CheckBoxes for item selection when in "Choose State" and no CheckBoxes in "Normal State", like the built-in Email app, with nice transition. To switch between states, set the "IsInChooseState" property. The control inherits ListBox.XQSOFT.WF.Designer: this is workflow designerzhanghai: my test project

    Read the article

  • Inside the Concurrent Collections: ConcurrentBag

    - by Simon Cooper
    Unlike the other concurrent collections, ConcurrentBag does not really have a non-concurrent analogy. As stated in the MSDN documentation, ConcurrentBag is optimised for the situation where the same thread is both producing and consuming items from the collection. We'll see how this is the case as we take a closer look. Again, I recommend you have ConcurrentBag open in a decompiler for reference. Thread Statics ConcurrentBag makes heavy use of thread statics - static variables marked with ThreadStaticAttribute. This is a special attribute that instructs the CLR to scope any values assigned to or read from the variable to the executing thread, not globally within the AppDomain. This means that if two different threads assign two different values to the same thread static variable, one value will not overwrite the other, and each thread will see the value they assigned to the variable, separately to any other thread. This is a very useful function that allows for ConcurrentBag's concurrency properties. You can think of a thread static variable: [ThreadStatic] private static int m_Value; as doing the same as: private static Dictionary<Thread, int> m_Values; where the executing thread's identity is used to automatically set and retrieve the corresponding value in the dictionary. In .NET 4, this usage of ThreadStaticAttribute is encapsulated in the ThreadLocal class. Lists of lists ConcurrentBag, at its core, operates as a linked list of linked lists: Each outer list node is an instance of ThreadLocalList, and each inner list node is an instance of Node. Each outer ThreadLocalList is owned by a particular thread, accessible through the thread local m_locals variable: private ThreadLocal<ThreadLocalList<T>> m_locals It is important to note that, although the m_locals variable is thread-local, that only applies to accesses through that variable. The objects referenced by the thread (each instance of the ThreadLocalList object) are normal heap objects that are not specific to any thread. Thinking back to the Dictionary analogy above, if each value stored in the dictionary could be accessed by other means, then any thread could access the value belonging to other threads using that mechanism. Only reads and writes to the variable defined as thread-local are re-routed by the CLR according to the executing thread's identity. So, although m_locals is defined as thread-local, the m_headList, m_nextList and m_tailList variables aren't. This means that any thread can access all the thread local lists in the collection by doing a linear search through the outer linked list defined by these variables. Adding items So, onto the collection operations. First, adding items. This one's pretty simple. If the current thread doesn't already own an instance of ThreadLocalList, then one is created (or, if there are lists owned by threads that have stopped, it takes control of one of those). Then the item is added to the head of that thread's list. That's it. Don't worry, it'll get more complicated when we account for the other operations on the list! Taking & Peeking items This is where it gets tricky. If the current thread's list has items in it, then it peeks or removes the head item (not the tail item) from the local list and returns that. However, if the local list is empty, it has to go and steal another item from another list, belonging to a different thread. 
It iterates through all the thread local lists in the collection using the m_headList and m_nextList variables until it finds one that has items in it, and it steals one item from that list. Up to this point, the two threads had been operating completely independently. To steal an item from another thread's list, the stealing thread has to do it in such a way as to not step on the owning thread's toes. Recall how adding and removing items both operate on the head of the thread's linked list? That gives us an easy way out - a thread trying to steal items from another thread can pop in round the back of another thread's list using the m_tail variable, and steal an item from the back without the owning thread knowing anything about it. The owning thread can carry on completely independently, unaware that one of its items has been nicked. However, this only works when there are at least 3 items in the list, as that guarantees there will be at least one node between the owning thread performing operations on the list head and the thread stealing items from the tail - there's no chance of the two threads operating on the same node at the same time and causing a race condition. If there's less than three items in the list, then there does need to be some synchronization between the two threads. In this case, the lock on the ThreadLocalList object is used to mediate access to a thread's list when there's the possibility of contention. Thread synchronization In ConcurrentBag, this is done using several mechanisms: Operations performed by the owner thread only take out the lock when there are less than three items in the collection. With three or greater items, there won't be any conflict with a stealing thread operating on the tail of the list. If a lock isn't taken out, the owning thread sets the list's m_currentOp variable to a non-zero value for the duration of the operation. This indicates to all other threads that there is a non-locked operation currently occuring on that list. The stealing thread always takes out the lock, to prevent two threads trying to steal from the same list at the same time. After taking out the lock, the stealing thread spinwaits until m_currentOp has been set to zero before actually performing the steal. This ensures there won't be a conflict with the owning thread when the number of items in the list is on the 2-3 item borderline. If any add or remove operations are started in the meantime, and the list is below 3 items, those operations try to take out the list's lock and are blocked until the stealing thread has finished. This allows a thread to steal an item from another thread's list without corrupting it. What about synchronization in the collection as a whole? Collection synchronization Any thread that operates on the collection's global structure (accessing anything outside the thread local lists) has to take out the collection's global lock - m_globalListsLock. This single lock is sufficient when adding a new thread local list, as the items inside each thread's list are unaffected. However, what about operations (such as Count or ToArray) that need to access every item in the collection? In order to ensure a consistent view, all operations on the collection are stopped while the count or ToArray is performed. This is done by freezing the bag at the start, performing the global operation, and unfreezing at the end: The global lock is taken out, to prevent structural alterations to the collection. m_needSync is set to true. 
This notifies all the threads that they need to take out their list's lock irregardless of what operation they're doing. All the list locks are taken out in order. This blocks all locking operations on the lists. The freezing thread waits for all current lockless operations to finish by spinwaiting on each m_currentOp field. The global operation can then be performed while the bag is frozen, but no other operations can take place at the same time, as all other threads are blocked on a list's lock. Then, once the global operation has finished, the locks are released, m_needSync is unset, and normal concurrent operation resumes. Concurrent principles That's the essence of how ConcurrentBag operates. Each thread operates independently on its own local list, except when they have to steal items from another list. When stealing, only the stealing thread is forced to take out the lock; the owning thread only has to when there is the possibility of contention. And a global lock controls accesses to the structure of the collection outside the thread lists. Operations affecting the entire collection take out all locks in the collection to freeze the contents at a single point in time. So, what principles can we extract here? Threads operate independently Thread-static variables and ThreadLocal makes this easy. Threads operate entirely concurrently on their own structures; only when they need to grab data from another thread is there any thread contention. Minimised lock-taking Even when two threads need to operate on the same data structures (one thread stealing from another), they do so in such a way such that the probability of actually blocking on a lock is minimised; the owning thread always operates on the head of the list, and the stealing thread always operates on the tail. Management of lockless operations Any operations that don't take out a lock still have a 'hook' to force them to lock when necessary. This allows all operations on the collection to be stopped temporarily while a global snapshot is taken. Hopefully, such operations will be short-lived and infrequent. That's all the concurrent collections covered. I hope you've found it as informative and interesting as I have. Next, I'll be taking a closer look at ThreadLocal, which I came across while analyzing ConcurrentBag. As you'll see, the operation of this class deserves a much closer look.
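As a quick illustration of the thread-static behaviour described above (a standalone demo, not ConcurrentBag's actual code), each thread sees its own copy of a [ThreadStatic] field and of a ThreadLocal<T> value:

    using System;
    using System.Threading;

    class ThreadLocalDemo
    {
        [ThreadStatic]
        private static int threadStaticValue;

        // ThreadLocal<T> (new in .NET 4) encapsulates the same idea and adds lazy per-thread initialization.
        private static readonly ThreadLocal<int> threadLocalValue = new ThreadLocal<int>(() => 0);

        static void Main()
        {
            var t1 = new Thread(() => { threadStaticValue = 1; threadLocalValue.Value = 1; Report("t1"); });
            var t2 = new Thread(() => { threadStaticValue = 2; threadLocalValue.Value = 2; Report("t2"); });
            t1.Start(); t2.Start();
            t1.Join(); t2.Join();

            // The main thread still sees its own (default) values, untouched by t1 and t2.
            Report("main");
        }

        static void Report(string name)
        {
            Console.WriteLine("{0}: [ThreadStatic]={1}, ThreadLocal={2}", name, threadStaticValue, threadLocalValue.Value);
        }
    }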

    Read the article

  • Using the HTML5 <input type="file" multiple="multiple"> Tag in ASP.NET

    - by Rick Strahl
    Per HTML5 spec the <input type="file" /> tag allows for multiple files to be picked from a single File upload button. This is actually a very subtle change that's very useful as it makes it much easier to send multiple files to the server without using complex uploader controls. Please understand though, that even though you can send multiple files using the <input type="file" /> tag, the process of how those files are sent hasn't really changed - there's still no progress information or other hooks that allow you to automatically make for a nicer upload experience without additional libraries or code. For that you will still need some sort of library (I'll post an example in my next blog post using plUpload). All the new features allow for is to make it easier to select multiple images from disk in one operation. Where you might have required many file upload controls before to upload several files, one File control can potentially do the job. How it works To create a file input box that allows with multiple file support you can simply do:<form method="post" enctype="multipart/form-data"> <label>Upload Images:</label> <input type="file" multiple="multiple" name="File1" id="File1" accept="image/*" /> <hr /> <input type="submit" id="btnUpload" value="Upload Images" /> </form> Now when the file open dialog pops up - depending on the browser and whether the browser supports it - you can pick multiple files. Here I'm using Firefox using the thumbnail preview I can easily pick images to upload on a form: Note that I can select multiple images in the dialog all of which get stored in the file textbox. The UI for this can be different in some browsers. For example Chrome displays 3 files selected as text next to the Browse… button when I choose three rather than showing any files in the textbox. Most other browsers display the standard file input box and display the multiple filenames as a comma delimited list in the textbox. Note that you can also specify the accept attribute in the <input> tag, which specifies a mime-type to specify what type of content to allow.Here I'm only allowing images (image/*) and the browser complies by just showing me image files to display. Likewise I could use text/* for all text formats registered on the machine or text/xml to only show XML files (which would include xml,xst,xsd etc.). Capturing Files on the Server with ASP.NET When you upload files to an ASP.NET server there are a couple of things to be aware of. When multiple files are uploaded from a single file control, they are assigned the same name. In other words if I select 3 files to upload on the File1 control shown above I get three file form variables named File1. This means I can't easily retrieve files by their name:HttpPostedFileBase file = Request.Files["File1"]; because there will be multiple files for a given name. The above only selects the first file. Instead you can only reliably retrieve files by their index. Below is an example I use in app to capture a number of images uploaded and store them into a database using a business object and EF 4.2.for (int i = 0; i < Request.Files.Count; i++) { HttpPostedFileBase file = Request.Files[i]; if (file.ContentLength == 0) continue; if (file.ContentLength > App.Configuration.MaxImageUploadSize) { ErrorDisplay.ShowError("File " + file.FileName + " is too large. 
Max upload size is: " + App.Configuration.MaxImageUploadSize); return View("UploadClassic",model); } var image = new ClassifiedsBusiness.Image(); var ms = new MemoryStream(16498); file.InputStream.CopyTo(ms); image.Entered = DateTime.Now; image.EntryId = model.Entry.Id; image.ContentType = "image/jpeg"; image.ImageData = ms.ToArray(); ms.Seek(0, SeekOrigin.Begin); // resize image if necessary and turn into jpeg Bitmap bmp = Imaging.ResizeImage(ms.ToArray(), App.Configuration.MaxImageWidth, App.Configuration.MaxImageHeight); ms.Close(); ms = new MemoryStream(); bmp.Save(ms,ImageFormat.Jpeg); image.ImageData = ms.ToArray(); bmp.Dispose(); ms.Close(); model.Entry.Images.Add(image); } This works great and also allows you to capture input from multiple input controls if you are dealing with browsers that don't support multiple file selections in the file upload control. The important thing here is that I iterate over the files by index, rather than using a foreach loop over the Request.Files collection. The files collection returns key name strings, rather than the actual files (who thought that was good idea at Microsoft?), and so that isn't going to work since you end up getting multiple keys with the same name. Instead a plain for loop has to be used to loop over all files. Another Option in ASP.NET MVC If you're using ASP.NET MVC you can use the code above as well, but you have yet another option to capture multiple uploaded files by using a parameter for your post action method.public ActionResult Save(HttpPostedFileBase[] file1) { foreach (var file in file1) { if (file.ContentLength < 0) continue; // do something with the file }} Note that in order for this to work you have to specify each posted file variable individually in the parameter list. This works great if you have a single file upload to deal with. You can also pass this in addition to your main model to separate out a ViewModel and a set of uploaded files:public ActionResult Edit(EntryViewModel model,HttpPostedFileBase[] uploadedFile) You can also make the uploaded files part of the ViewModel itself - just make sure you use the appropriate naming for the variable name in the HTML document (since there's Html.FileFor() extension). Browser Support You knew this was coming, right? The feature is really nice, but unfortunately not supported universally yet. Once again Internet Explorer is the problem: No shipping version of Internet Explorer supports multiple file uploads. IE10 supposedly will, but even IE9 does not. All other major browsers - Chrome, Firefox, Safari and Opera - support multi-file uploads in their latest versions. So how can you handle this? If you need to provide multiple file uploads you can simply add multiple file selection boxes and let people either select multiple files with a single upload file box or use multiples. Alternately you can do some browser detection and if IE is used simply show the extra file upload boxes. It's not ideal, but either one of these approaches makes life easier for folks that use a decent browser and leaves you with a functional interface for those that don't. Here's a UI I recently built as an alternate uploader with multiple file upload buttons: I say this is my 'alternate' uploader - for my primary uploader I continue to use an add-in solution. Specifically I use plUpload and I'll discuss how that's implemented in my next post. 
Although I think that plUpload (and many of the other packaged JavaScript upload solutions) are a better choice especially for large uploads, for simple one file uploads input boxes work well enough. The advantage of this solution is that it's very easy to handle on the server side. Any of the JavaScript controls require special handling for uploads which I'll also discuss in my next post. © Rick Strahl, West Wind Technologies, 2005-2012. Posted in HTML5, ASP.NET, MVC
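As a rough sketch of the "files on the ViewModel" option mentioned above (the class name and Title property are hypothetical – the only requirement is that the file property's name matches the name attribute of the file input, File1 in the earlier form):

    using System.Web;
    using System.Web.Mvc;

    public class UploadImagesViewModel
    {
        public string Title { get; set; }

        // Bound from <input type="file" multiple="multiple" name="File1" />
        public HttpPostedFileBase[] File1 { get; set; }
    }

    public class ImagesController : Controller
    {
        [HttpPost]
        public ActionResult Save(UploadImagesViewModel model)
        {
            foreach (var file in model.File1 ?? new HttpPostedFileBase[0])
            {
                if (file == null || file.ContentLength == 0)
                    continue;
                // process the uploaded file here (save to disk, database, etc.)
            }
            return View(model);
        }
    }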

    Read the article

  • Making Sense of ASP.NET Paths

    - by Renso
    Making Sense of ASP.NET Paths ASP.Net includes quite a plethora of properties to retrieve path information about the current request, control and application. There's a ton of information available about paths on the Request object, some of it appearing to overlap and some of it buried several levels down, and it can be confusing to find just the right path that you are looking for. To keep things straight I thought it a good idea to summarize the path options along with descriptions and example paths. I wrote a post about this a long time ago in 2004 and I find myself frequently going back to that page to quickly figure out which path I’m looking for in processing the current URL. Apparently a lot of people must be doing the same, because the original post is the second most visited even to this date on this blog to the tune of nearly 500 hits per day. So, I decided to update and expand a bit on the original post with a little more information and clarification based on the original comments. Request Object Paths Available Here's a list of the Path related properties on the Request object (and the Page object). Assume a path like http://www.west-wind.com/webstore/admin/paths.aspx for the paths below where webstore is the name of the virtual. Request Property Description and Value ApplicationPath Returns the web root-relative logical path to the virtual root of this app. /webstore/ PhysicalApplicationPath Returns local file system path of the virtual root for this app. c:\inetpub\wwwroot\webstore PhysicalPath Returns the local file system path to the current script or path. c:\inetpub\wwwroot\webstore\admin\paths.aspx Path FilePath CurrentExecutionFilePath All of these return the full root relative logical path to the script page including path and scriptname. CurrentExcecutionFilePath will return the ‘current’ request path after a Transfer/Execute call while FilePath will always return the original request’s path. /webstore/admin/paths.aspx AppRelativeCurrentExecutionFilePath Returns an ASP.NET root relative virtual path to the script or path for the current request. If in  a Transfer/Execute call the transferred Path is returned. ~/admin/paths.aspx PathInfo Returns any extra path following the script name. If no extra path is provided returns the root-relative path (returns text in red below). string.Empty if no PathInfo is available. /webstore/admin/paths.aspx/ExtraPathInfo RawUrl Returns the full root relative URL including querystring and extra path as a string. /webstore/admin/paths.aspx?sku=wwhelp40 Url Returns a fully qualified URL including querystring and extra path. Note this is a Uri instance rather than string. http://www.west-wind.com/webstore/admin/paths.aspx?sku=wwhelp40 UrlReferrer The fully qualified URL of the page that sent the request. This is also a Uri instance and this value is null if the page was directly accessed by typing into the address bar or using an HttpClient based Referrer client Http header. http://www.west-wind.com/webstore/default.aspx?Info Control.TemplateSourceDirectory Returns the logical path to the folder of the page, master or user control on which it is called. This is useful if you need to know the path only to a Page or control from within the control. For non-file controls this returns the Page path. /webstore/admin/ As you can see there’s a ton of information available there for each of the three common path formats: Physical Path is an OS type path that points to a path or file on disk. 
Logical Path is a Web path that is relative to the Web server’s root. It includes the virtual plus the application relative path. ~/ (Root-relative) Path is an ASP.NET specific path that includes ~/ to indicate the virtual root Web path. ASP.NET can convert virtual paths into either logical paths using Control.ResolveUrl(), or physical paths using Server.MapPath(). Root relative paths are useful for specifying portable URLs that don’t rely on relative directory structures and very useful from within control or component code. You should be able to get any necessary format from ASP.NET from just about any path or script using these mechanisms. ~/ Root Relative Paths and ResolveUrl() and ResolveClientUrl() ASP.NET supports root-relative virtual path syntax in most of its URL properties in Web Forms. So you can easily specify a root relative path in a control rather than a location relative path: <asp:Image runat="server" ID="imgHelp" ImageUrl="~/images/help.gif" /> ASP.NET internally resolves this URL by using ResolveUrl("~/images/help.gif") to arrive at the root-relative URL of /webstore/images/help.gif which uses the Request.ApplicationPath as the basepath to replace the ~. By convention any custom Web controls also should use ResolveUrl() on URL properties to provide the same functionality. In your own code you can use Page.ResolveUrl() or Control.ResolveUrl() to accomplish the same thing: string imgPath = this.ResolveUrl("~/images/help.gif"); imgHelp.ImageUrl = imgPath; Unfortunately ResolveUrl() is limited to WebForm pages, so if you’re in an HttpHandler or Module it’s not available. ASP.NET Mvc also has it’s own more generic version of ResolveUrl in Url.Decode: <script src="<%= Url.Content("~/scripts/new.js") %>" type="text/javascript"></script> which is part of the UrlHelper class. In ASP.NET MVC the above sort of syntax is actually even more crucial than in WebForms due to the fact that views are not referencing specific pages but rather are often path based which can lead to various variations on how a particular view is referenced. In a Module or Handler code Control.ResolveUrl() unfortunately is not available which in retrospect seems like an odd design choice – URL resolution really should happen on a Request basis not as part of the Page framework. Luckily you can also rely on the static VirtualPathUtility class: string path = VirtualPathUtility.ToAbsolute("~/admin/paths.aspx"); VirtualPathUtility also many other quite useful methods for dealing with paths and converting between the various kinds of paths supported. One thing to watch out for is that ToAbsolute() will throw an exception if a query string is provided and doesn’t work on fully qualified URLs. I wrote about this topic with a custom solution that works fully qualified URLs and query strings here (check comments for some interesting discussions too). Similar to ResolveUrl() is ResolveClientUrl() which creates a fully qualified HTTP path that includes the protocol and domain name. It’s rare that this full resolution is needed but can be useful in some scenarios. Mapping Virtual Paths to Physical Paths with Server.MapPath() If you need to map root relative or current folder relative URLs to physical URLs or you can use HttpContext.Current.Server.MapPath(). Inside of a Page you can do the following: string physicalPath = Server.MapPath("~/scripts/ww.jquery.js")); MapPath is pretty flexible and it understands both ASP.NET style virtual paths as well as plain relative paths, so the following also works. 
string physicalPath = Server.MapPath("scripts/silverlight.js"); as well as dot relative syntax: string physicalPath = Server.MapPath("../scripts/jquery.js"); Once you have the physical path you can perform standard System.IO Path and File operations on the file. Remember with physical paths and IO or copy operations you need to make sure you have permissions to access files and folders based on the Web server user account that is active (NETWORK SERVICE, ASPNET typically). Note the Server.MapPath will not map up beyond the virtual root of the application for security reasons. Server and Host Information Between these settings you can get all the information you may need to figure out where you are at and to build new Url if necessary. If you need to build a URL completely from scratch you can get access to information about the server you are accessing: Server Variable Function and Example SERVER_NAME The of the domain or IP Address wwww.west-wind.com or 127.0.0.1 SERVER_PORT The port that the request runs under. 80 SERVER_PORT_SECURE Determines whether https: was used. 0 or 1 APPL_MD_PATH ADSI DirectoryServices path to the virtual root directory. Note that LM typically doesn’t work for ADSI access so you should replace that with LOCALHOST or the machine’s NetBios name. /LM/W3SVC/1/ROOT/webstore Request.Url and Uri Parsing If you still need more control over the current request URL or  you need to create new URLs from an existing one, the current Request.Url Uri property offers a lot of control. Using the Uri class and UriBuilder makes it easy to retrieve parts of a URL and create new URLs based on existing URL. The UriBuilder class is the preferred way to create URLs – much preferable over creating URIs via string concatenation. Uri Property Function Scheme The URL scheme or protocol prefix. http or https Port The port if specifically specified. DnsSafeHost The domain name or local host NetBios machine name www.west-wind.com or rasnote LocalPath The full path of the URL including script name and extra PathInfo. /webstore/admin/paths.aspx Query The query string if any ?id=1 The Uri class itself is great for retrieving Uri parts, but most of the properties are read only if you need to modify a URL in order to change it you can use the UriBuilder class to load up an existing URL and modify it to create a new one. Here are a few common operations I’ve needed to do to get specific URLs: Convert the Request URL to an SSL/HTTPS link For example to take the current request URL and converted  it to a secure URL can be done like this: UriBuilder build = new UriBuilder(Request.Url); build.Scheme = "https"; build.Port = -1; // don't inject portUri newUri = build.Uri; string newUrl = build.ToString(); Retrieve the fully qualified URL without a QueryString AFAIK, there’s no native routine to retrieve the current request URL without the query string. It’s easy to do with UriBuilder however: UriBuilder builder = newUriBuilder(Request.Url); builder.Query = ""; stringlogicalPathWithoutQuery = builder.ToString();
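As a quick illustration of the handler/module scenario mentioned above, where Control.ResolveUrl() isn't available, here is a minimal sketch (the handler name and script path are made up) that resolves the same virtual path both logically and physically:

    using System.Web;

    public class PathDemoHandler : IHttpHandler
    {
        public void ProcessRequest(HttpContext context)
        {
            // ~/ (root-relative) -> logical path, e.g. /webstore/scripts/ww.jquery.js
            string logicalPath = VirtualPathUtility.ToAbsolute("~/scripts/ww.jquery.js");

            // ~/ (root-relative) -> physical path on disk
            string physicalPath = context.Server.MapPath("~/scripts/ww.jquery.js");

            context.Response.ContentType = "text/plain";
            context.Response.Write(logicalPath + "\r\n" + physicalPath);
        }

        public bool IsReusable { get { return true; } }
    }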

    Read the article

  • TFS 2010 SDK: Smart Merge - Programmatically Create your own Merge Tool

    - by Tarun Arora
    Technorati Tags: Team Foundation Server 2010,TFS SDK,TFS API,TFS Merge Programmatically,TFS Work Items Programmatically,TFS Administration Console,ALM   The information available in the Merge window in Team Foundation Server 2010 is very important in the decision making during the merging process. However, at present the merge window shows very limited information, more that often you are interested to know the work item, files modified, code reviewer notes, policies overridden, etc associated with the change set. Our friends at Microsoft are working hard to change the game again with vNext, but because at present the merge window is a model window you have to cancel the merge process and go back one after the other to check the additional information you need. If you can relate to what i am saying, you will enjoy this blog post! I will show you how to programmatically create your own merging window using the TFS 2010 API. A few screen shots of the WPF TFS 2010 API – Custom Merging Application that we will be creating programmatically, Excited??? Let’s start coding… 1. Get All Team Project Collections for the TFS Server You can read more on connecting to TFS programmatically on my blog post => How to connect to TFS Programmatically 1: public static ReadOnlyCollection<CatalogNode> GetAllTeamProjectCollections() 2: { 3: TfsConfigurationServer configurationServer = 4: TfsConfigurationServerFactory. 5: GetConfigurationServer(new Uri("http://xxx:8080/tfs/")); 6: 7: CatalogNode catalogNode = configurationServer.CatalogNode; 8: return catalogNode.QueryChildren(new Guid[] 9: { CatalogResourceTypes.ProjectCollection }, 10: false, CatalogQueryOptions.None); 11: } 2. Get All Team Projects for the selected Team Project Collection You can read more on connecting to TFS programmatically on my blog post => How to connect to TFS Programmatically 1: public static ReadOnlyCollection<CatalogNode> GetTeamProjects(string instanceId) 2: { 3: ReadOnlyCollection<CatalogNode> teamProjects = null; 4: 5: TfsConfigurationServer configurationServer = 6: TfsConfigurationServerFactory.GetConfigurationServer(new Uri("http://xxx:8080/tfs/")); 7: 8: CatalogNode catalogNode = configurationServer.CatalogNode; 9: var teamProjectCollections = catalogNode.QueryChildren(new Guid[] {CatalogResourceTypes.ProjectCollection }, 10: false, CatalogQueryOptions.None); 11: 12: foreach (var teamProjectCollection in teamProjectCollections) 13: { 14: if (string.Compare(teamProjectCollection.Resource.Properties["InstanceId"], instanceId, true) == 0) 15: { 16: teamProjects = teamProjectCollection.QueryChildren(new Guid[] { CatalogResourceTypes.TeamProject }, false, 17: CatalogQueryOptions.None); 18: } 19: } 20: 21: return teamProjects; 22: } 3. Get All Branches with in a Team Project programmatically I will be passing the name of the Team Project for which i want to retrieve all the branches. When consuming the ‘Version Control Service’ you have the method QueryRootBranchObjects, you need to pass the recursion type => none, one, full. Full implies you are interested in all branches under that root branch. 
1: public static List<BranchObject> GetParentBranch(string projectName) 2: { 3: var branches = new List<BranchObject>(); 4: 5: var tfs = TfsTeamProjectCollectionFactory.GetTeamProjectCollection(new Uri("http://<ServerName>:8080/tfs/<teamProjectName>")); 6: var versionControl = tfs.GetService<VersionControlServer>(); 7: 8: var allBranches = versionControl.QueryRootBranchObjects(RecursionType.Full); 9: 10: foreach (var branchObject in allBranches) 11: { 12: if (branchObject.Properties.RootItem.Item.ToUpper().Contains(projectName.ToUpper())) 13: { 14: branches.Add(branchObject); 15: } 16: } 17: 18: return branches; 19: } 4. Get All Branches associated to the Parent Branch Programmatically Now that we have the parent branch, it is important to retrieve all child branches of that parent branch. Lets see how we can achieve this using the TFS API. 1: public static List<ItemIdentifier> GetChildBranch(string parentBranch) 2: { 3: var branches = new List<ItemIdentifier>(); 4: 5: var tfs = TfsTeamProjectCollectionFactory.GetTeamProjectCollection(new Uri("http://<ServerName>:8080/tfs/<CollectionName>")); 6: var versionControl = tfs.GetService<VersionControlServer>(); 7: 8: var i = new ItemIdentifier(parentBranch); 9: var allBranches = 10: versionControl.QueryBranchObjects(i, RecursionType.None); 11: 12: foreach (var branchObject in allBranches) 13: { 14: foreach (var childBranche in branchObject.ChildBranches) 15: { 16: branches.Add(childBranche); 17: } 18: } 19: 20: return branches; 21: } 5. Get Merge candidates between two branches Programmatically Now that we have the parent and the child branch that we are interested to perform a merge between we will use the method ‘GetMergeCandidates’ in the namespace ‘Microsoft.TeamFoundation.VersionControl.Client’ => http://msdn.microsoft.com/en-us/library/bb138934(v=VS.100).aspx 1: public static MergeCandidate[] GetMergeCandidates(string fromBranch, string toBranch) 2: { 3: var tfs = TfsTeamProjectCollectionFactory.GetTeamProjectCollection(new Uri("http://<ServerName>:8080/tfs/<CollectionName>")); 4: var versionControl = tfs.GetService<VersionControlServer>(); 5: 6: return versionControl.GetMergeCandidates(fromBranch, toBranch, RecursionType.Full); 7: } 6. Get changeset details Programatically Now that we have the changeset id that we are interested in, we can get details of the changeset. The Changeset object contains the properties => http://msdn.microsoft.com/en-us/library/microsoft.teamfoundation.versioncontrol.client.changeset.aspx - Changes: Gets or sets an array of Change objects that comprise this changeset. - CheckinNote: Gets or sets the check-in note of the changeset. - Comment: Gets or sets the comment of the changeset. - PolicyOverride: Gets or sets the policy override information of this changeset. - WorkItems: Gets an array of work items that are associated with this changeset. 1: public static Changeset GetChangeSetDetails(int changeSetId) 2: { 3: var tfs = TfsTeamProjectCollectionFactory.GetTeamProjectCollection(new Uri("http://<ServerName>:8080/tfs/<CollectionName>")); 4: var versionControl = tfs.GetService<VersionControlServer>(); 5: 6: return versionControl.GetChangeset(changeSetId); 7: } 7. Possibilities In future posts i will try and extend this idea to explore further possibilities, but few features that i am sure will further help during the merge decision making process would be, - View changed files - Compare modified file with current/previous version - Merge Preview - Last Merge date Any other features that you can think of?
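As a small example of surfacing that extra context in a custom merge window, here is a sketch that builds on step 6 and walks the Changeset.WorkItems array described above; the server/collection URL is a placeholder just like in the snippets above, and the usual namespaces are Microsoft.TeamFoundation.Client, Microsoft.TeamFoundation.VersionControl.Client and Microsoft.TeamFoundation.WorkItemTracking.Client.

    public static void PrintChangesetContext(int changeSetId)
    {
        var tfs = TfsTeamProjectCollectionFactory.GetTeamProjectCollection(
            new Uri("http://<ServerName>:8080/tfs/<CollectionName>"));
        var versionControl = tfs.GetService<VersionControlServer>();

        Changeset changeset = versionControl.GetChangeset(changeSetId);

        // Comment and associated work items - the details the stock merge window doesn't show.
        Console.WriteLine("Comment: {0}", changeset.Comment);
        foreach (WorkItem workItem in changeset.WorkItems)
        {
            Console.WriteLine("  {0} [{1}] - {2}", workItem.Id, workItem.State, workItem.Title);
        }
    }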

    Read the article

  • ODEE Green Field (Windows) Part 5 - Deployment and Validation

    - by AndyL-Oracle
And here we are, almost finished with our installation of Oracle Documaker Enterprise Edition ("ODEE") in a Windows green field environment. Let's recap what we've done so far: In part 1, I went over the basic process that I intended to show with installing an ODEE on a green field server, and walked you through the basic installation of the Oracle 11g database. In part 2, I covered the installation of the WebLogic application server. In part 3, I showed you how to install SOA Suite for WebLogic. In part 4, we did the first part of the installation of ODEE itself. What remains after all of that is the deployment of the ODEE components onto the database and application server - so let's get to it!
DATABASE
First, we'll deploy the schemas to the database. The schemas are created during the ODEE installation according to the responses provided during the install process. To deploy the schemas, you'll need to login to the database server in your green field environment. Open a command line and CD into ODEE_HOME\documaker\database\oracle11g. Run SQLPLUS as SYSDBA and execute dmkr_admin.sql: sqlplus / as sysdba @dmkr_admin.sql
Then execute dmkr_asline.sql and dmkr_admin_correspondence_example.sql. If you require additional languages, run the appropriate SQL scripts (e.g. dmkr_asline_es.sql for Spanish).
APPLICATION SERVER
Next, we'll deploy the WebLogic domain and its components - Documaker web services, Documaker Interactive, Documaker dashboard, and more. To deploy the components, you'll need to login to the application server in your green field environment.
1. Open Windows Explorer and navigate to ODEE_HOME\documaker\j2ee\weblogic\oracle11g\scripts.
2. Using a text editor such as Notepad++, modify weblogic_installation_properties and set the location of MIDDLEWARE_HOME and ODEE_HOME. If you have used the defaults you'll probably need to change the E: to C: and that's it. Save the changes.
3. Continuing in the same directory, use your text editor to modify set_middleware_env.cmd and set the drive and path to MIDDLEWARE_HOME. If you have used the defaults you'll probably need to just change E: to C: and that's it. Save the changes.
4. In the same directory, execute wls_create_domain.cmd by double-clicking it. This should run to completion. If it does not, review any errors and correct them, and rerun the script.
5. In the same directory, execute wls_add_correspondence.cmd by double-clicking it - again this should run to completion.
6. Next, we'll start the AdminServer - this is the main WebLogic domain server. To start it, use Windows Explorer and navigate to MIDDLEWARE_HOME\user_projects\domains\idocumaker_domain. Double-click startWebLogic.cmd and the server startup will begin. Once you see output that indicates that the server status changed to RUNNING, you may proceed.
a. Note: if you saw database connection errors, you probably didn't make sure your database name and connection type match. You can change this manually in the WebLogic Console. Open a browser and navigate to http://localhost:7001/console (replace localhost with the name of your application server host if you aren't opening the browser on the server), and login with the weblogic credential you provided in the ODEE installation process.
b. Once you're logged in, open Services > Data Sources. Select dmkr_admin and click Connection Pool.
c. The end of the URL should match the connection type you chose. If you chose ServiceName, the URL should be: jdbc:oracle:thin:@//<hostname>:1521/<serviceName> and if you chose SID, the URL should be: jdbc:oracle:thin:@//<hostname>:1521/<SIDname>
d. An example serviceName is a fully qualified DNS-style name, e.g. "idmaker.us.oracle.com". (It does not need to actually resolve in DNS.) An example SID is just a name, e.g. IDMAKER.
e. Save the change and repeat for the data source dmkr_asline.
f. You will also need to make the same changes in the ODEE_HOME/documaker/docfactory/config/context/.bindings file - open the file in a text editor, locate the URL lines and make the appropriate change, then save the file.
7. Back in the ODEE_HOME\documaker\j2ee\weblogic\oracle11g\scripts directory, execute create_users_groups.cmd.
8. In the same directory, execute create_users_groups_correspondence_example.cmd.
9. Open a browser and navigate to http://localhost:7001/jpsquery. Replace localhost with the name of your application server host if you aren't running the browser on the application server. If you changed the default port for the AdminServer from 7001, use the port you changed it to. You should see output indicating success.
10. Start the WebLogic managed servers by opening a command prompt and navigating to MIDDLEWARE_HOME/user_projects/domains/idocumaker_domain/bin/. When you start the servers listed below, you will be prompted to enter the WebLogic credentials to start the server. You can prevent this by providing the credentials in the startManagedWebLogic.cmd file for the WLS_USER and WLS_PASS values. Note that the credentials will be stored in cleartext. To start each server, type in the command shown.
a. Start the JMS Server: ./startManagedWebLogic.cmd jms_server
b. Start Dashboard/Documaker Administrator: ./startManagedWebLogic.cmd dmkr_server
c. Start Documaker Interactive for Correspondence: ./startManagedWebLogic.cmd idm_server
SOA Composites
If you're planning on testing out the approval process components of BPEL that can be used with Documaker Interactive, then use the following steps to deploy the SOA composites. If you're not going to use BPEL, you can skip to the next section.
1. Stop the servers listed in the previous section (Step 10) in the reverse order that they were started.
2. Run the Domain configuration command: navigate to and execute MIDDLEWARE_HOME/wlserver_10.3/common/bin/config.cmd.
3. Select Extend and click Next.
4. Select the iDocumaker Domain and click Next.
5. Select Oracle SOA Suite – 11.1.1.0 (this may automatically select other components, which is OK). Click Next.
6. View the Configure JDBC resources screen. You should not make any changes. Click Next.
7. Check both connections and click Test Connections. After a successful test, click Next. If the tests fail, something is broken. Go back to Configure JDBC resources and check your service name/SID.
8. Check all schemas. Set a password (it will be the same for all schemas). Enter the database information (service name, host name, port). Click Next.
9. Connections should test successfully. If not, go back and fix any errors. Click Next.
10. Click Next to pass through Optional Configuration.
11. Click Extend.
12. Click Done.
13. Open a terminal window and navigate to/execute: ODEE_HOME/documaker/j2ee/weblogic/oracle11g/bpel/antbuild.cmd
14. Start the WebLogic Servers – AdminServer, jms_server, dmkr_server, idm_server. If you forgot how to do this, see the previous section, Step 10. Note: if you previously changed the startManagedWebLogic.cmd script for WLS_USER and WLS_PASS you will need to make those changes again.
15. Start the WebLogic server soa_server1: MIDDLEWARE_HOME/user_projects/domains/idocumaker_domain/bin/startManagedWebLogic.cmd soa_server1
16. Open a browser to http://localhost:7001/console and login.
17. Navigate to Services > Data Sources and select DMKR_ASLINE.
18. Click the Targets tab. Check soa_server1, then click Save. Repeat for the DMKR_ADMIN data source.
19. Open a command prompt and navigate to ODEE_HOME/j2ee/weblogic/oracle11g/scripts, then execute deploy_soa.cmd.
That's it! (As if that wasn't enough?)
DOCUMAKER
Deploy the sample MRL resources by navigating to/executing ODEE_HOME/documaker/mstrres/dmres/deploysamplemrl.bat. You should see approximately 500 resources deployed into the database.
Start the Factory Services: Start > Run > services.msc. Locate the service named "ODDF xxxx", right-click it and select Start. Note that each Assembly Line has a separate Factory setup, including its own Factory service and Docupresentment service. The services are named for the assembly line and the machine on which they are installed (because you could have multiple machines servicing a single assembly line), so this allows for easy scripting to control all the services if you choose to do so. Repeat for the Docupresentment service. Note that each Assembly Line has a separate Docupresentment.
Using Windows Explorer, navigate to ODEE_HOME/documaker/mstrres/dmres/input, select one of the XML files, and copy it into ODEE_HOME/documaker/hotdirectory. Note: if you chose a different hot directory during installation, copy the file there instead. Momentarily you should see the XML file disappear!
Open a browser and navigate to http://localhost:10001/DocumakerDashboard (previous versions 12.0-12.2 use http://localhost:10001/dashboard) and verify that the job processed successfully. Note that some transactions may fail if you do not have a properly configured email server, and this is OK. You can set up a simple SMTP server (just search the internet for "SMTP developer" and you'll get several to choose from).
So... that's it? Where are we at this point? You now have a completely functional ODEE installation, from soup to nuts as they say. You can further expand your installation by doing some of the following activities:
- clustering WebLogic services
- configuring WebLogic for redundancy
- configuring Oracle 11g for RAC
- adding additional Factory servers for redundancy/processing capacity
- setting up a real MRL (instead of the sample resources)
- testing Documaker Web Services for job submission
and more! I certainly hope you've enjoyed this and find it useful. If you find yourself running into trouble, visit the Oracle Community for Documaker - there is plenty of activity there and you can ask questions. For more concentrated assistance, you can engage an Oracle consultant who is a subject matter expert to assist you. Feel free to email me [andy (dot) little (at) oracle (dot) com] and I can connect you with the appropriate resource to get started. Best of luck! -Andy

    Read the article

  • Fixing a SkyDrive Sync Disaster

    - by Rick Strahl
    For a few months I've been using SkyDrive to handle some basic synching tasks for a number of folders of mine. Specifically I've been dumping a few of my development folders into sky drive so I have a live running backup. It had been working just fine until about a week ago when something went awry. Badly! The idea is that the SkyDrive should sync files, but somewhere in its sync relationship it appears that SkyDrive got confused and assumed it needed to sync back older files to my local machine from the SkyDrive server. So rather than syncing my newer files to the server SkyDrive was pushing older files back to me. Because SkyDrive is so slow actually updating data it's not unusual for SkyDrive to be far behind in syncing and apparently some files were out of date by several months. Of course this is insidious because I didn't notice it for quite some time. I'd been happily working away on my files when a few days ago I noted a bunch of files with -RasXps (my machine name) popping up in various folders. At first I thought my Git repository was giving me a fit, but eventually realized that SkyDrive was actually pushing old files into my monitored folders. To be fair SkyDrive did make backups of the existing files, but by the time I caught it there were literally a few thousand files scattered on my machine that were now updated with old files from online. Here's what some of this looks like: If you look at the directory list you see a bunch of files with a -RasXps postfix appended to them. Those are the files that SkyDrive replaced and backed up on my machine. As you can see the backed up files are actually newer than the ones it pulled from the online SkyDrive. Unless I modified the files after they were updated they all were older than the existing local files. Not exactly how I imagined my synching would work. At first I started cleaning up this mess manually. In most cases the obvious solution was to simply delete the original file and replace with the -RasXps file, but not in all files. Some scrutiny was required and besides being a pain in the ass to rename files, quite frequently I had to dig out Beyond Compare to compare a few files where it wasn't quite clear what's wrong. I quickly realized that doing this by hand would be too hard for the large number of files that got hosed. Hacking together a small .NET Utility So, I figured the easiest way to tackle this is to write a small utility app that shows me all the mangled files that have backups, allows me to compare them and then quickly select and update them, removing the -RasXps file after choosing one of the two files. What I ended up with was a quick and dirty WinForms app that allows me to pick a root folder, and then shows all the -MachineName files: I start by picking a base folder and a template to search for - typically the -MachineName. Clicking Go brings up a list of all files in that folder and its subdirectories.  The list also displays the dates for the saved (-MachineName) file and the current file on disk, along with highlighting for the newer of the two. I can right click on any file and get a context menu pop up to open the folder in Explorer, or open Beyond Compare and view the two files to compare differences which I found very helpful for a number of files where I had modified the files after SkyDrive had updated to an old one. Typically these would be the green files (of which there were thankfully few). 
To 'fix' files I can select any number of files in the list, then use one of the three buttons on the right to apply an operation. I can use the Saved file - that is, the backup file that SkyDrive created with the -MachineName extension (-RasXps above). Or I can use the current file, which is the file with the right name on disk right now, and delete the -MachineName file. Or on some occasions I can just opt to delete both of them. For some files like binaries it's often easier to just delete them and rebuild than to choose. For the most part the process involves accepting the pink files, and checking the few green files to see if any modifications were made since the file was updated incorrectly by SkyDrive. For me luckily those are few in number. Anyway, I thought I'd share this utility in case anybody else runs into this issue. I've included the VS2012 solution and all the source code so you can see how it works and you can tweak it as needed. The .NET 4.5 binaries are also included if you can't compile. Be warned though! This rough code is provided as is and makes no guarantees or claims about file safety. All three of the action buttons on the form will delete data. It's a very rough utility and there are no safeguards that ask nicely before deleting files. I highly recommend you make a backup before you have at it. This tool is very narrow in focus, but it might also work with other sync issues from other vendors. I seem to remember that I had similar issues with SugarSync at some point and it too created the -MachineName style files on sync conflicts. Hope this helps somebody out so you can avoid wasting the better part of a full work day on this… Resources: Download the Source Code and Binaries for SkyDrive Rescue. © Rick Strahl, West Wind Technologies, 2005-2013. Posted in Windows, .NET.
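To give a feel for how the core of such a cleanup utility might look, here is a small, hedged C# sketch of my own - it is not Rick's actual code - that scans a folder tree for files carrying a machine-name suffix and reports which copy is newer. The suffix and root path are assumptions you would adjust, and a real tool would put the delete/replace actions behind explicit confirmation, as the WinForms utility above does with its three buttons.

using System;
using System.IO;

class SyncConflictScanner
{
    static void Main()
    {
        const string suffix = "-RasXps";      // machine-name suffix SkyDrive appended (assumption)
        const string root = @"C:\SkyDrive";   // root folder to scan (assumption)

        foreach (string backup in Directory.EnumerateFiles(root, "*" + suffix + ".*", SearchOption.AllDirectories))
        {
            // Derive the original file name by stripping the suffix from the name part
            string dir = Path.GetDirectoryName(backup) ?? root;
            string name = Path.GetFileNameWithoutExtension(backup).Replace(suffix, string.Empty);
            string original = Path.Combine(dir, name + Path.GetExtension(backup));

            if (!File.Exists(original))
                continue;

            DateTime backupTime = File.GetLastWriteTime(backup);
            DateTime currentTime = File.GetLastWriteTime(original);

            // Report which copy is more recent so the user can decide what to keep
            Console.WriteLine("{0}: saved copy {1:g}, current copy {2:g} -> newer: {3}",
                original, backupTime, currentTime,
                backupTime > currentTime ? "saved" : "current");
        }
    }
}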

    Read the article

  • CodePlex Daily Summary for Tuesday, June 15, 2010

    CodePlex Daily Summary for Tuesday, June 15, 2010New ProjectsBackup on Build: Backup critical files on each Visual Studio build.CDN Support for EPiServer CMS: This module adds CDN support for EPiServer CMS by modifying outgoing links.Custom WCF Bindings: This project contains some custom WCF LOB SDK bindings I have created including a SalesForce one. I will blog on updates as they occur. Please se...DocIcon for SharePoint 2010: DocIcon for SharePoint re-enables links from document icons in SharePoint 2010. This feature was in previous versions of SharePoint, but was remove...EEG Peak Detection: EEG Peak Detectionemployeemanagement1: employeemanagement1Enables map services on top of existing map providers like Google Maps: Services include Map visualization services, Map decoration services, Spot registration services and Spot naming services.fbprivacy: Tool to assess your Facebook Privacy SettingsfMRI SVM Toolbox: fMRI SVM ToolboxGCMS – using .Net for human CMS: GCMS makes only what you need to do with a CMS and nothing more and it makes it with .NETqjblog: My First Blog.Send2Sharepoint: Office(Word,Excel,Outlook) and windows explorer addin to upload documents to sharepoint document library. SharePoint Find and Replace: SharePoint Find & Replace allows you to replace a specific string within a site collection with a different value. For example, when you change a l...SharePoint Management Studio: This project developed on Visula Studio 2008 and c# language. The main aim is manage your SharePoint 2007 FARM.SharePoint PageController: A SharePoint solution which provides an extensible framework to perform actions on a per-page basis in SharePoint. OOTB functionality allows for f...SilverNotePad: Simple notepad built using MVVM patern.SolidWorks Addin Development: The SolidWorks Addin Development project is dedicated to helping developers and non-developers with creating fully functional addins.Sunlit World Scheme: Sunlit World Scheme is a nearly R4RS-compliant Scheme implementation that supports threading, TCP, UDP, cryptography, and simple graphics and windo...TimeBend: Time tracking gone wild.TinyCMS: Jednostavan CMS s mogućnosti unosa vijesti, linkova i natječaja. CSS je napravljen tek toliko... Aplikacija izrađena za dev4Fun natjecanje.Ujimanet Android: text categorization tool for androidVisual Storm Engine: Visual Storm es un motor para probar nuevas tecnologias orientadas a la creacion de video juegos. Por ahora solo soporta Windows Vista/7 y usa Dire...New ReleasesAjax ASP.Net Forum: developer.insecla.com-forum_v0.1.4: *VERSION: 0.1.4* FEATURES ADDED Rating Threads (Through AjaxCTK (included Ms .DLL in the BIN folder)) Empowered Within AJAX Custom star ima...AlphaGet: Alpha 3: Important: from this release WinGet changes its name to AlphaGet in order to identify it better and make it search-engine friendly. New Features N...Backup on Build: Backup on Build v1.0.0 Initial Release: Initial Release version 1.0.0Boleto.Net: BoletoNet: Última versão estável da BoletoNet.dll. 
O código fonte dessa versão pode ser encontrado em http://boletonet.codeplex.com/SourceControl/list/change...CDN Support for EPiServer CMS: CDN Support v1: See links on start page for information on how to use and install this module.Chargify.NET: Chargify.NET v0.750: Adding support for creating freemium subscription plans Adding preliminary JSON support Adding ISO 3166-1 Alpha 2 data embedded in the library ...Community Forums NNTP bridge: Community Forums NNTP Bridge V38: Release of the Community Forums NNTP Bridge to access the social and anwsers MS forums with a single, open source NNTP bridge. This release has ad...ContainerOne - C# application server: V0.1.3.0: New minor release containing: Infrastructure - core service An installer of a windows service which provides the following: Service registry Even...DocIcon for SharePoint 2010: wwEsp.DocIcon Deployment Package Release 1.0.0: This package installs the DocIcon feature on a SharePoint 2010 server farm. The solution is deployed as a Farm-level feature that can be enabled or...Ethical Hacking ASP.NET: Version 1.2.0.0: For the complete list of changes, new features and fixes in the new version, please view the Version History page. Read more about the available te...Folder Bookmarks: Folder Bookmarks 1.6.3: The latest version of Folder Bookmarks (1.6.3), with new features and Mini-Menu UI Changes (1.4). Once you have extracted the file, do not delete ...Hades: Projet Hadès - Official Demo - Version 0.1.1 Beta: Second release correcting some bugs... ---------------------------------------------------------------------------- - Projet Hadès - Official Demo...KooBoo Image Gallery: RC 1: This new Version has this features 1) Refactoring to change the mispelled word galery to gallery 2) Change to use the plugin in the same page of ...LibWowArmory: LibWowArmory 0.3 beta: LibWowArmory 0.3 betaThis release of the LibWowArmory source code matches the WoW Armory as of version 3.3.3. Changes since version 0.2.3:Solution...MailChimp4Umbraco: 0.90 stable: Can be used in productionMapWindow6: MapWindow 6.0 June 14: This version adds the WebMercator projection and fixes a bug that was causing some perfect spheres to be created as oblate WGS1984 spheroids.MDownloader: MDownloader-0.15.18.59782: Supported FileServe. Supported SharingMatrix. Fixed minor bugs.MGM - MyGroupManager: MyGroupManager v0.1.5 - Alpha: At this point the application appears feature complete and works pretty well. The code still needs some tweaks (error handling), and a general look...MvcPager: MvcPager 1.4: MvcPager 1.4 source codes and demo projects MvcPager 1.4版源代码及示例文件Nito.LINQ: Beta (v0.6): Rx version The "with Rx" versions of Nito.LINQ are built against Rx 1.0.2563.0, released 2010-06-09. Supported Platforms .NET 4.0 Client Profile, ...open gaze and mouse analyzer: Ogama 3.3: This release was published on 14.06.2010 and is a bugfix release. For the list of changes please visit http://www.ogama.net. Only use this installe...patterns & practices: Prism: Prism 4.0 Drop 2: Prism 4.0 Drop 2 Welcome to the second drop of Prism 4.0 (formally known as the Composite Application Guidance for WPF and Silverlight). 
This drop ...Prism Software Factory Light: 0.5 Beta: 4ward Prism Software Factory Light - 0.5 Beta releaseThis is the first public beta release of the 4ward Prism Software Factory Light that allows to...PROGRAMMABLE SOFTWARE DEVELOPMENT ENVIRONMENT: PROGRAMMABLE SOFTWARE DEVELOPMENT ENVIRONMENT--3.3: Over the last several months, my primary research effort has been directed at producing strictly portable development methods between C and C# . T...qjblog: v1.source: v1.sourceqjblog: v1blog: V1 BlogQuick Performance Monitor: Version 1.4.2: Added 'Move to new window' functionality.Refix - .NET dependency management: Refix v0.1.0.90 ALPHA: Added console tree-style visualisation of solution dependencies, as well as some bug fixes. This version should work out of the box with the demons...SEMICO Framework: Version Stable 1.0.0.3: Version Stable 1.0.0.3SharePoint Find and Replace: 1.0.16: Version: 1.0.16 This release is the first stable release of this project, including the Microsoft public license agreement. Fixes: Added about dia...SharePoint Management Studio: v1: v1SharePoint PageController: SharePoint PageController: For SharePoint 2010 and 2007 running on IIS 7Software Is Hardwork: Sw. Is Hw. Lib. 3.0.0.x+06: Sw. Is Hw. Lib. 3.0.0.x+06SolidWorks Addin Development: GenericAddinFramework-06.14.2010: R1.SourceGrid: SourceGrid 4.30: Sources are here Note that SourceGrid sources are not hosted on CodePlex. The sources are hosted on bitbucket.org Main Changes Improved hidden ...SSIS Expression Editor & Tester: Expression Editor and Tester v1.0.2.0: Corrected release of expression editor tool, no changes to control. Download and extract the files to get started, no install required. Changes Co...Sunlit World Scheme: Sunlit World Scheme - 20100614 - source and binary: This is the result of building the current source code in Debug mode. The source code is included.TinyCMS: TinyCMS: Source kodVCC: Latest build, v2.1.30614.0: Automatic drop of latest buildVianaNET - Videoanalysis for physical motion: VianaNET 1.2 - beta: This is the VianaNET beta release with some bug fixes. Would like to have some comments on it. Regards, AdrianWorkLogger: Worklogger Beta 1: Simple work logger for Windows in WPFMost Popular ProjectsWBFS ManagerRawrAJAX Control ToolkitMicrosoft SQL Server Product Samples: DatabaseSilverlight ToolkitWindows Presentation Foundation (WPF)patterns & practices – Enterprise LibraryPHPExcelMicrosoft SQL Server Community & SamplesASP.NETMost Active Projectspatterns & practices – Enterprise LibraryRhyduino - Arduino and Managed CodeCassandraemonCommunity Forums NNTP bridgedotSpatialjQuery Library for SharePoint Web ServicesBlogEngine.NETLightweight Fluent WorkflowNB_Store - Free DotNetNuke Ecommerce Catalog ModuleUmbraco CMS

    Read the article

  • 24+ Coda Alternatives for Windows and Linux

    - by Matt
Coda plays an important role in designing layouts on the Mac. There are numerous Coda alternatives for Windows and Linux too. It is not possible to describe each and every one, so some of the Coda alternatives which work on both Windows and Linux platforms are discussed below.
EditPlus $35.00 A good thing about EditPlus is that it highlights URLs and email addresses, activating them when you 'ctrl + double-click'. It also has a built-in browser for previewing HTML, and FTP and SFTP support. It also supports macros and RegEx find and replace.
UltraEdit $49.99 It is another good Coda alternative for Windows and Linux. It is well suited for editing text, HTML and hex, and also serves as an advanced PHP, Perl, Java and JavaScript editor for programmers. It supports disk-based 64-bit or standard file handling on 32-bit Windows platforms, on Windows 2000 and later versions.
HippoEdit $39.95 HippoEDIT has the best autocomplete: it pops up a 'tooltip' above your cursor as you type, suggesting words you've already typed. It does syntax highlighting for over two dozen languages.
Sublime Text $59.00 Sublime Text's awesome 'zoomed out' view of the file lets you focus on the area you want. It lets you open a local file when you right-click on its link, and there are a few automation features, so this would make a solid choice of text editor.
TextPad $24.70 TextPad is a simple editor with nifty features such as column select, drag-and-drop text between files, and hyperlink support. It also supports large files.
Aptana Free Aptana Studio is one of the best editors working on both Windows and Linux. It is a complete web development environment that blends powerful authoring tools with a collection of online hosting and collaboration services. It is quite helpful as it supports PHP, CSS, FTP, and more.
SciTE Free It is a SCIntilla-based text editor. It has gradually developed into a generally useful editor. It provides for building and running programs. It is best used for jobs with simple configurations. SciTE is currently available for Intel Win32 and Linux-compatible operating systems with GTK+. It has been run on Windows XP and on Fedora 8 and Ubuntu 7.10 with GTK+ 2.12.
E Text Editor $34.96 E Text Editor is a new text editor for Windows which also works on Linux. It has powerful editing features and some unique abilities. It makes text manipulation quite fast and easy, and lets the user focus on writing as it automatically does all the manual work. It can be extended in any language. It supports TextMate bundles, thus allowing the user to tap into a huge and active community.
Editra Free Editra is an upcoming editor, with some fantastic features such as user profiles, auto-completion, session saving, and syntax highlighting for 60+ languages. Plugins can extend the feature set, offering an integrated Python console, FTP client, file browser, and calculator, among others.
PSPad Free PSPad is good for writing CSS, as it brings an internal web browser and a macro recorder to the table. It also supports hex editing, and some degree of code compiling.
JEdit Free It is a mature programmer's text editor and has taken a good deal of time to be developed into what it is today. It is better than many costlier development tools due to its features and simplicity of use. It has been released as free software with full source code, provided under the terms of the GPL 2.0, which also adds to its attractiveness.
NEdit Free It is a multi-purpose text editor for the X Window System, which also works on Linux. It combines a standard, easy to use graphical user interface with the full functionality and stability required by users who edit text for long periods a day. It also provides thorough support for development in various languages and facilitates the use of text processors and other tools at the same time. It can be used productively by anyone who needs to edit text and is quite a user-friendly tool. Its salient features include syntax highlighting with built-in patterns, auto indent, tab emulation, block indentation adjustment, etc. As of version 5.1, NEdit may be freely distributed under the terms of the GNU General Public License.
MadEdit Free MadEdit is an open-source and cross-platform text/hex editor. It is written in C++ and wxWidgets. MadEdit can edit files in Text/Column/Hex modes. It also supports many useful functions, such as syntax highlighting, word wrap, encoding for UTF-8/16/32, and others. It also supports word count, which makes it quite a useful text editor for both Windows and Linux. It was most recently modified on 10/09/2010.
KompoZer Free KompoZer is a complete web authoring system that combines web file management and easy-to-use WYSIWYG web page editing. KompoZer has been designed to be completely and extensively easy to use. It is thus an ideal tool for non-technical computer users who want to create an attractive, professional-looking web site without knowing HTML or web coding. It is based on the NVU source code.
Vim Free Vim or "Vi IMproved" is an advanced text editor. Its salient features are syntax highlighting, word completion, and a huge amount of contributed content. Vim has several "modes" on offer for editing, which adds to the efficiency of editing. This makes it a less user-friendly application, but it is also a strength for its users. The normal mode binds alphanumeric keys to task-oriented commands. The visual mode highlights text. More tools for search & replace, defining functions, etc. are offered through command line mode. Vim comes with complete help.
Notepad++ Free One of the best free text editors for Windows out there; with support for simple things—like syntax highlighting and folding—all the way up to FTP, Notepad++ should tick most of the boxes.
Notepad2 Free Notepad2 is also based on the Scintilla editing engine, but it's much simpler than Notepad++. It bills itself as being fast, light-weight, and Notepad-like.
Crimson Editor Free Crimson Editor has the ability to edit remote files, using a built-in FTP client; there's also a spell checker.
TotalEdit Free TotalEdit allows file comparison, RegEx search and replace, and has multiple options for file backup / versioning. For cleanup, it offers (X)HTML and XML customizable formatting, and a spell checker.
In-Type Free
ConTEXT Free
SourceEdit Free SourceEdit includes features such as clipboard history, syntax highlighting and autocompletion for a decent set of languages, plus a hex editor and FTP client.
RJ TextED Free RJ TextED supports integration with TopStyle Lite and provides HTML validation and formatting. It includes an FTP client, a file browser, and a code browser, as well as a character map and support for email.
GEDIT Free It is one of the best Coda alternatives for Windows and Linux. It has syntax highlighting and is best suited for programming. It has many attractive features such as full support for UTF-8, undo/redo and clipboard support, search and replace, configurable syntax highlighting for various languages, and many more supportive features. It is extensible with plug-ins.
Other important Coda alternatives for Windows and Linux are Redcar, Bluefish Editor, NVU, RubyMine, SlickEdit, Geany, Editra, txt2html and CSSED. There are many more. It's up to the user to decide which one best suits his requirements.

    Read the article

  • Wrapping ASP.NET Client Callbacks

    - by Ricardo Peres
Client Callbacks are probably the least known (and, I dare say, least loved) of all the AJAX options in ASP.NET, which also include the UpdatePanel, Page Methods and Web Services. The reason for that, I believe, is their relative complexity: Get a reference to a JavaScript function; Dynamically register a function that calls the above reference; Have a JavaScript handler call the registered function. However, it has the nice advantage of being self-contained, that is, it doesn't need additional files, such as web services, JavaScript libraries, etc, or static methods declared on a page, or any kind of attributes. So, here's what I want to do: Have a DOM element which exposes a method that is executed server side, passing it a string and returning a string; Have a server-side event that handles the client-side call; Have two client-side user-supplied callback functions for handling the success and error results. I'm going to develop a custom control without user interface that does the registration of the client JavaScript method as well as a server-side event that can be hooked by some handler on a page. My markup will look like this:

<script type="text/javascript">

    function onCallbackSuccess(result, context)
    {
    }

    function onCallbackError(error, context)
    {
    }

</script>
<my:CallbackControl runat="server" ID="callback" SendAllData="true" OnCallback="OnCallback"/>

The control itself looks like this:

public class CallbackControl : Control, ICallbackEventHandler
{
    #region Public constructor
    public CallbackControl()
    {
        this.SendAllData = false;
        this.Async = true;
    }
    #endregion

    #region Public properties and events
    public event EventHandler<CallbackEventArgs> Callback;

    [DefaultValue(true)]
    public Boolean Async
    {
        get;
        set;
    }

    [DefaultValue(false)]
    public Boolean SendAllData
    {
        get;
        set;
    }

    #endregion

    #region Protected override methods

    protected override void Render(HtmlTextWriter writer)
    {
        writer.AddAttribute(HtmlTextWriterAttribute.Id, this.ClientID);
        writer.RenderBeginTag(HtmlTextWriterTag.Span);

        base.Render(writer);

        writer.RenderEndTag();
    }

    protected override void OnInit(EventArgs e)
    {
        String reference = this.Page.ClientScript.GetCallbackEventReference(this, "arg", "onCallbackSuccess", "context", "onCallbackError", this.Async);
        String script = String.Concat("\ndocument.getElementById('", this.ClientID, "').callback = function(arg, context, onCallbackSuccess, onCallbackError){", ((this.SendAllData == true) ? "__theFormPostCollection.length = 0; __theFormPostData = ''; WebForm_InitCallback(); " : String.Empty), reference, ";};\n");

        this.Page.ClientScript.RegisterStartupScript(this.GetType(), String.Concat("callback", this.ClientID), script, true);

        base.OnInit(e);
    }

    #endregion

    #region Protected virtual methods
    protected virtual void OnCallback(CallbackEventArgs args)
    {
        EventHandler<CallbackEventArgs> handler = this.Callback;

        if (handler != null)
        {
            handler(this, args);
        }
    }

    #endregion

    #region ICallbackEventHandler Members

    String ICallbackEventHandler.GetCallbackResult()
    {
        CallbackEventArgs args = new CallbackEventArgs(this.Context.Items["Data"] as String);

        this.OnCallback(args);

        return (args.Result);
    }

    void ICallbackEventHandler.RaiseCallbackEvent(String eventArgument)
    {
        this.Context.Items["Data"] = eventArgument;
    }

    #endregion
}

And the event argument class:

[Serializable]
public class CallbackEventArgs : EventArgs
{
    public CallbackEventArgs(String argument)
    {
        this.Argument = argument;
        this.Result = String.Empty;
    }

    public String Argument
    {
        get;
        private set;
    }

    public String Result
    {
        get;
        set;
    }
}

You will notice two properties on the CallbackControl:
Async: indicates if the call should be made asynchronously (the default) or synchronously;
SendAllData: indicates if the callback call will include the view and control state of all of the controls on the page, so that, on the server side, they will have their properties set when the Callback event is fired.
The CallbackEventArgs class exposes two properties:
Argument: the read-only argument passed to the client-side function;
Result: the result to return to the client-side callback function, set from the Callback event handler.
An example of a handler for the Callback event would be:

protected void OnCallback(Object sender, CallbackEventArgs e)
{
    e.Result = String.Join(String.Empty, e.Argument.Reverse());
}

Finally, in order to fire the Callback event from the client, you only need this:

<input type="text" id="input"/>
<input type="button" value="Get Result" onclick="document.getElementById('callback').callback(document.getElementById('input').value, 'context', onCallbackSuccess, onCallbackError)"/>

The syntax of the callback function is:
arg: some string argument;
context: some context that will be passed to the callback functions (success or failure);
callbackSuccessFunction: some function that will be called when the callback succeeds;
callbackFailureFunction: some function that will be called if the callback fails for some reason.
Give it a try and see if it helps!

    Read the article

  • Using SQL Source Control with Fortress or Vault &ndash; Part 2

    - by AjarnMark
    In Part 1, I started talking about using Red-Gate’s newest version of SQL Source Control and how I really like it as a viable method to source control your database development.  It looks like this is going to turn into a little series where I will explain how we have done things in the past, and how life is different with SQL Source Control.  I will also explain some of my philosophy and methodology around deployment with these tools.  But for now, let’s talk about some of the good and the bad of the tool itself. More Kudos and Features I mentioned previously how impressed I was with the responsiveness of Red-Gate’s team.  I have been having an ongoing email conversation with Gyorgy Pocsi, and as I have run into problems or requested things behave a little differently, it has not been more than a day or two before a new Build is ready for me to download and test.  Quite impressive! I’m sure much of the requests I put in were already in the plans, so I can’t really take credit for them, but throughout this conversation, Red-Gate has implemented several features that were not in the first Early Access version.  Those include: Honoring the Fortress configuration option to require Work Item (Bug) IDs on check-ins. Adding the check-in comment text as a comment to the Work Item. Adding the list of checked-in files, along with the Fortress links for automatic History and DIFF view Updating the status of a Work Item on check-in (e.g. setting the item to Complete or, in our case “Dev-Complete”) Support for the Fortress 2.0 API, and not just the Vault Pro 5.1 API.  (See later notes regarding support for Fortress 2.0). These were all features that I felt we really needed to have in-place before I could honestly consider converting my team to using SQL Source Control on a regular basis.  Now that I have those, my only excuse is not wanting to switch boats on the team mid-stream.  So when we wrap up our current release in a few weeks, we will make the jump.  In the meantime, I will continue to bang on it to make sure it is stable.  It passed one test for stability when I did a test load of one of our larger database schemas into Fortress with SQL Source Control.  That database has about 150 tables, 200 User-Defined Functions and nearly 900 Stored Procedures.  The initial load to source control went smoothly and took just a brief amount of time. Warnings Remember that this IS still in pre-release stage and while I have not had any problems after that first hiccup I wrote about last time, you still need to treat it with a healthy respect.  As I understand it, the RTM is targeted for February.  There are a couple more features that I hope make it into the final release version, but if not, they’ll probably be coming soon thereafter.  Those are: A Browse feature to let me lookup the Work Item ID instead of having to remember it or look back in my Item details.  This is just a matter of convenience. I normally have my Work Item list open anyway, so I can easily look it up, but hey, why not make it even easier. A multi-line comment area.  The current space for writing check-in comments is a single-line text box.  I would like to have a multi-line space as I sometimes write lengthy commentary.  But I recognize that it is a struggle to get most developers to put in more than the word “fixed” as their comment, so this meets the need of the majority as-is, and it’s not a show-stopper for us. Merge.  SQL Source Control currently does not have a Merge feature.  
If two or more people make changes to the same database object, you will get a warning of the conflict and have to choose which one wins (and then manually edit to include the others’ changes).  I think it unlikely you will run into actual conflicts in Stored Procedures and Functions, but you might with Views or Tables.  This will be nice to have, but I’m not losing any sleep over it.  And I have multiple tools at my disposal to do merges manually, so really not a show-stopper for us. Automation has its limits.  As cool as this automation is, it has its limits and there are some changes that you will be better off scripting yourself.  For example, if you are refactoring table definitions, and want to change a column name, you can write that as a quick sp_rename command and preserve the data within that column.  But because this tool is looking just at a before and after picture, it cannot tell that you just renamed a column.  To the tool, it looks like you dropped one column and added another.  This is not a knock against Red-Gate.  All automated scripting tools have this issue, unless the are actively monitoring your every step to know exactly what you are doing.  This means that when you go to Deploy your changes, SQL Compare will script the change as a column drop and add, or will attempt to rebuild the entire table.  Unfortunately, neither of these approaches will preserve the existing data in that column the way an sp_rename will, and so you are better off scripting that change yourself.  Thankfully, SQL Compare will produce warnings about the potential loss of data before it does the actual synchronization and give you a chance to intercept the script and do it yourself. Also, please note that the current official word is that SQL Source Control supports Vault Professional 5.1 and later.  Vault Professional is the new name for what was previously known as Fortress.  (You can read about the name change on SourceGear’s site.)  The last version of Fortress was 2.x, and the API for Fortress 2.x is different from the API for Vault Pro.  At my company, we are currently running Fortress 2.0, with plans to upgrade to Vault Pro early next year.  Gyorgy was able to come up with a work-around for me to be able to use SQL Source Control with Fortress 2.0, even though it is not officially supported.  If you are using Fortress 2.0 and want to use SQL Source Control, be aware that this is not officially supported, but it is working for us, and you can probably get the work-around instructions from Red-Gate if you’re really, really nice to them. Upcoming Topics Some of the other topics I will likely cover in this series over the next few weeks are: How we used to do source control back in the old days (a few weeks ago) before SQL Source Control was available to Vault users What happens when you restore a database that is linked to source control Handling multiple development branches of source code Concurrent Development practices and handling Conflicts Deployment Tips and Best Practices A recap after using the tool for a while

    Read the article

  • 2-Bay External HDD Enclosure in JBOD mode fails to detect both drives (Linux & Windows)

    - by mgc8888
    I recently purchased a couple of USB 3.0 External HDD Enclosures to use for storage and backup; the idea was to have one act as backup to the other, with 4 x 3TB drives in total. However, the second drive in each is not accessible in either Linux nor Windows, and I could not determine the reason. 1. Situation The two enclosures are slightly different (couldn't find them in stock at the same time) yet from many little details appear to be the same Chinese base design with a tweaked outer shell. The models are: Sharkoon 2-Bay RAID Box Fantec MR-35DU3 The drives are Seagate 3TB Barracuda ST33000651AS, firmware CC44, all identical. From reading manuals and online sources, I determined that JBOD would be the optimal setup for my needs -- addressing the two drives separately in each enclosure would be important, making it easy to swap drives and mix&match them if needed; all the other modes implied the controller doing a combination of the drives. The software used was Debian GNU/Linux - testing/wheezy - kernel 2.6.39-2 and Windows 7 Ultimate. 2. Description of the problem Now, here comes the problem: every time I connect either of the enclosures to a PC using the supplied cable (tried a different one as well), only the HDD in the top bay is readable, the one below is detected yet errors out in various ways. According to the manuals, it should not happen: in JBOD, the system should be able to "see" two separate drives upon connection. This happens with both enclosures and any combination of HDDs (i.e. if I swap them, the same thing happens), so the HDDs are good and I think so are the enclosures (two different companies making similar products that failed in an identical fashion would be very unlikely). The top HDD can be used fine every time, I actually tried a speed test from Linux and got about 150MiB/s reads, so all is working as it should; the one below refuses to work every time. So the failure is consistent. To make sure this was not some obscure Linux bug, I tried the same under Windows 7, and the system also only created one drive letter for a drive of 3TB size (so it was only seeing one instead of both). Placing an older, known good, 2TB drive in the top bay made that the one recognised, so we have the same issue under Windows as well. Log entries under Linux (tested here with a 3TB and a 2TB drive so I could differentiate them; either one works in the top enclosure, in the test setup the 3TB one is on top). You can see them being detected, the top one is ok, but for the bottom one only errors: Jul 19 23:28:15 media kernel: [260150.582436] usb 6-1: New USB device found, idVendor=1ca1, idProduct=18ae Jul 19 23:28:15 media kernel: [260150.582440] usb 6-1: New USB device strings: Mfr=1, Product=2, SerialNumber=3 Jul 19 23:28:15 media kernel: [260150.582442] usb 6-1: Product: Usb Sata Bridge Jul 19 23:28:15 media kernel: [260150.582444] usb 6-1: Manufacturer: SYMWAVE Jul 19 23:28:15 media kernel: [260150.582446] usb 6-1: SerialNumber: 39584B304C4E3441 Jul 19 23:28:15 media kernel: [260150.870412] scsi11 : usb-storage 6-1:1.0 Jul 19 23:28:16 media kernel: [260151.882087] scsi 11:0:0:0: Direct-Access SYMWAVE ST33000651AS CC44 PQ: 0 ANSI: 4 Jul 19 23:28:16 media kernel: [260151.882242] scsi 11:0:0:1: Direct-Access SYMWAVE ST32000641AS CC12 PQ: 0 ANSI: 4 Jul 19 23:28:16 media kernel: [260151.882677] sd 11:0:0:0: Attached scsi generic sg2 type 0 Jul 19 23:28:16 media kernel: [260151.882774] sd 11:0:0:0: [sdb] Very big device. Trying to use READ CAPACITY(16). 
Jul 19 23:28:16 media kernel: [260151.882857] sd 11:0:0:1: Attached scsi generic sg3 type 0 Jul 19 23:28:16 media kernel: [260151.882893] sd 11:0:0:0: [sdb] 5860533168 512-byte logical blocks: (3.00 TB/2.72 TiB) Jul 19 23:28:16 media kernel: [260151.883085] xhci_hcd 0000:03:00.0: WARN: Stalled endpoint Jul 19 23:28:16 media kernel: [260151.883582] sd 11:0:0:0: [sdb] Write Protect is off Jul 19 23:28:16 media kernel: [260151.883961] sd 11:0:0:1: [sdc] 3907029168 512-byte logical blocks: (2.00 TB/1.81 TiB) Jul 19 23:28:16 media kernel: [260151.884145] xhci_hcd 0000:03:00.0: WARN: Stalled endpoint Jul 19 23:28:16 media kernel: [260151.884570] sd 11:0:0:1: [sdc] Write Protect is off Jul 19 23:28:16 media kernel: [260151.884855] sd 11:0:0:0: [sdb] Very big device. Trying to use READ CAPACITY(16). Jul 19 23:28:16 media kernel: [260151.885286] xhci_hcd 0000:03:00.0: WARN: Stalled endpoint Jul 19 23:28:16 media kernel: [260151.885807] xhci_hcd 0000:03:00.0: WARN: Stalled endpoint Jul 19 23:28:16 media kernel: [260151.909595] xhci_hcd 0000:03:00.0: WARN: Stalled endpoint Jul 19 23:28:16 media kernel: [260151.910159] sd 11:0:0:1: [sdc] Result: hostbyte=DID_OK driverbyte=DRIVER_SENSE Jul 19 23:28:16 media kernel: [260151.910163] sd 11:0:0:1: [sdc] Sense Key : Illegal Request [current] Jul 19 23:28:16 media kernel: [260151.910167] Info fld=0x0 Jul 19 23:28:16 media kernel: [260151.910169] sd 11:0:0:1: [sdc] Add. Sense: Invalid field in cdb Jul 19 23:28:16 media kernel: [260151.910172] sd 11:0:0:1: [sdc] CDB: Read(10): 28 20 00 00 00 00 00 00 08 00 Jul 19 23:28:16 media kernel: [260151.910182] quiet_error: 2 callbacks suppressed Jul 19 23:28:16 media kernel: [260151.910570] xhci_hcd 0000:03:00.0: WARN: Stalled endpoint Jul 19 23:28:16 media kernel: [260151.911153] sd 11:0:0:1: [sdc] Result: hostbyte=DID_OK driverbyte=DRIVER_SENSE Jul 19 23:28:16 media kernel: [260151.911156] sd 11:0:0:1: [sdc] Sense Key : Illegal Request [current] Jul 19 23:28:16 media kernel: [260151.911159] Info fld=0x0 Jul 19 23:28:16 media kernel: [260151.911161] sd 11:0:0:1: [sdc] Add. Sense: Invalid field in cdb Jul 19 23:28:16 media kernel: [260151.911164] sd 11:0:0:1: [sdc] CDB: Read(10): 28 20 00 00 00 00 00 00 08 00 Jul 19 23:28:16 media kernel: [260151.911385] xhci_hcd 0000:03:00.0: WARN: Stalled endpoint Jul 19 23:28:16 media kernel: [260151.911902] sd 11:0:0:1: [sdc] Result: hostbyte=DID_OK driverbyte=DRIVER_SENSE Jul 19 23:28:16 media kernel: [260151.911905] sd 11:0:0:1: [sdc] Sense Key : Illegal Request [current] Jul 19 23:28:16 media kernel: [260151.911908] Info fld=0x0 Jul 19 23:28:16 media kernel: [260151.911910] sd 11:0:0:1: [sdc] Add. Sense: Invalid field in cdb Jul 19 23:28:16 media kernel: [260151.911913] sd 11:0:0:1: [sdc] CDB: Read(10): 28 20 00 00 00 00 00 00 08 00 Jul 19 23:28:16 media kernel: [260151.912128] xhci_hcd 0000:03:00.0: WARN: Stalled endpoint Jul 19 23:28:16 media kernel: [260151.912650] sd 11:0:0:1: [sdc] Result: hostbyte=DID_OK driverbyte=DRIVER_SENSE Jul 19 23:28:16 media kernel: [260151.912653] sd 11:0:0:1: [sdc] Sense Key : Illegal Request [current] Jul 19 23:28:16 media kernel: [260151.912656] Info fld=0x0 Jul 19 23:28:16 media kernel: [260151.912657] sd 11:0:0:1: [sdc] Add. 
Sense: Invalid field in cdb Jul 19 23:28:16 media kernel: [260151.912660] sd 11:0:0:1: [sdc] CDB: Read(10): 28 20 00 00 00 00 00 00 08 00 Jul 19 23:28:16 media kernel: [260151.912876] xhci_hcd 0000:03:00.0: WARN: Stalled endpoint Jul 19 23:28:16 media kernel: [260151.913439] sd 11:0:0:1: [sdc] Result: hostbyte=DID_OK driverbyte=DRIVER_SENSE Jul 19 23:28:16 media kernel: [260151.913442] sd 11:0:0:1: [sdc] Sense Key : Illegal Request [current] Jul 19 23:28:16 media kernel: [260151.913445] Info fld=0x0 Jul 19 23:28:16 media kernel: [260151.913446] sd 11:0:0:1: [sdc] Add. Sense: Invalid field in cdb Jul 19 23:28:16 media kernel: [260151.913449] sd 11:0:0:1: [sdc] CDB: Read(10): 28 20 00 00 00 00 00 00 08 00 Jul 19 23:28:16 media kernel: [260151.945227] xhci_hcd 0000:03:00.0: WARN: Stalled endpoint Jul 19 23:28:16 media kernel: [260151.945863] sd 11:0:0:1: [sdc] Result: hostbyte=DID_OK driverbyte=DRIVER_SENSE Jul 19 23:28:16 media kernel: [260151.945866] sd 11:0:0:1: [sdc] Sense Key : Illegal Request [current] Jul 19 23:28:16 media kernel: [260151.945870] Info fld=0x0 Jul 19 23:28:16 media kernel: [260151.945871] sd 11:0:0:1: [sdc] Add. Sense: Invalid field in cdb Jul 19 23:28:16 media kernel: [260151.945875] sd 11:0:0:1: [sdc] CDB: Read(10): 28 20 00 00 00 00 00 00 08 00 (...) and so on for like 10 seconds until it gives up (...) 3. Question So, my question would be: what is causing this? Am I missing something, should I configure things differently, is this a known limitation? Searching online for more information did not yield any useful results... Thank you in advance for any help!

    Read the article

  • Portal And Content - Content Integration - Best Practices

    - by Stefan Krantz
    Lately we have seen an increase in projects that have failed to either get user friendly content integration or non satisfactory performance. Our intention is to mitigate any knowledge gap that our previous post might have left you with, therefore this post will repeat some recommendation or reference back to old useful post. Moreover this post will help you understand ground up how to design, architect and implement business enabled, responsive and performing portals with complex requirements on business centric information publishing. Design the Information Model The key to successful portal deployments is Information modeling, it's a key task to understand the use case you designing for, therefore I have designed a set of question you need to ask yourself or your customer: Question: Who will own the content, IT or Business? Answer: BusinessQuestion: Who will publish the content, IT or Business? Answer: BusinessQuestion: Will there be multiple publishers? Answer: YesQuestion: Are the publishers computer scientist?Answer: NoQuestion: How often do the information changes, daily, weekly, monthly?Answer: Daily, weekly If your answers to the questions matches at least 2, we strongly recommend you design your content with following principles: Divide your pages in to logical sections, where each section is marked with its purpose Assign capabilities to each section, does it contain text, images, formatting and/or is it static and is populated through other contextual information Select editor/design element type WYSIWYG - Rich Text Plain Text - non-format text Image - Image object Static List - static list of formatted informationDynamic Data List - assembled information from multiple data files through CMIS query The result of such design map could look like following below examples: Based on the outcome of the required elements in the design column 3 from the left you will now simply design a data model in WebCenter Content - Site Studio by creating a Region Definition structure matching your design requirements.For more information on how to create a Region definition see following post: Region Definition Post - note see instruction 7 for details. Each region definition can now be used to instantiate data files, a data file will hold the actual data for each element in the region definition. Another way you can see this is to compare the region definition as an extension to the metadata model in WebCenter Content for each data file item. Design content templates With a solid dependable information model we can now proceed to template creation and page design, in this phase focuses on how to place the content sections from the region definition on the page via a Content Presenter template. Remember by creating content presenter templates you will leverage the latest and most integrated technology WebCenter has to offer. This phase is much easier since the you already have the information model and design wire-frames to base the logic on, however there is still few considerations to pay attention to: Base the template on ADF and make only necessary exceptions to markup when required Leverage ADF design components for Tabs, Accordions and other similar components, this way the design in the content published areas will comply with other design areas based on custom ADF taskflows There is no performance impact when using meta data or region definition based data All data access regardless of type, metadata or xml data it can be accessed via the Content Presenter - Node. 
    Applied examples of how to access data:
    - Access a metadata property of the document: #{node.propertyMap['myProp'].value} - where myProp can be, for instance, dDocName, dDocTitle, xComments or any other available metadata field.
    - Access element data from the data file XML: #{node.propertyMap['[Region Definition Name]:[Element name]'].asTextHtml} - where Region Definition Name is the region definition that the current data file instantiates, and Element name is the element whose value you want to read from the data file.
    I recommend the following useful posts on the content template topic: CMIS queries and template creation (see instruction 9 for details) and Static List template rendering. For more information on templates, see: Single Item Content Template, Multi Item Content Template, and Expression Language.

    Internationalization Considerations
    When integrating content assets via the Content Presenter, you have probably realized by now that the content item/data file is wired to the page. What is also pretty common at this stage is that the content item/data file supports only one language, since it is neither practical nor business friendly to mix several languages into one complex structure. That leaves you with a very common dilemma: building a complete new portal for each locale, which is not a good option. With a little information modeling and a clear naming convention, however, this can be addressed. Simply make sure that all content items/data files follow a predictable naming convention such as "Content1_EN" for the English rendition and "Content1_ES" for the Spanish rendition. With a small, non-intrusive customization you can then dynamically switch the actual content item/data file just before rendering. By following this approach you not only get a simple mechanism for internationalized content, you also preserve the Content Presenter's ability to let the business publish information at run time on existing and new pages. I recommend the following useful post on internationalization topics: Internationalize with Content Presenter.
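    To make the naming-convention idea concrete, here is a minimal sketch of the kind of helper you could call just before rendering; assume a hypothetical class (LocalizedContentResolver) whose result is handed to whatever parameter you use to point the Content Presenter at a data file. The class name, supported suffixes and fallback rule are illustrative assumptions, not part of the WebCenter API.

    ```java
    import java.util.Arrays;
    import java.util.HashSet;
    import java.util.Locale;
    import java.util.Set;

    // Minimal sketch of the "Content1_EN" / "Content1_ES" naming convention described above.
    // Everything here (class name, supported suffixes, fallback rule) is illustrative only.
    public class LocalizedContentResolver {

        // Renditions that actually exist as data files in the content repository.
        private static final Set<String> SUPPORTED_SUFFIXES =
                new HashSet<>(Arrays.asList("EN", "ES"));
        private static final String DEFAULT_SUFFIX = "EN";

        /** Returns the locale-specific data file name, e.g. resolve("Content1", es) -> "Content1_ES". */
        public String resolve(String baseName, Locale userLocale) {
            String suffix = userLocale.getLanguage().toUpperCase(Locale.ROOT);
            if (!SUPPORTED_SUFFIXES.contains(suffix)) {
                suffix = DEFAULT_SUFFIX;   // fall back to the default rendition
            }
            return baseName + "_" + suffix;
        }

        public static void main(String[] args) {
            LocalizedContentResolver resolver = new LocalizedContentResolver();
            System.out.println(resolver.resolve("Content1", new Locale("es"))); // Content1_ES
            System.out.println(resolver.resolve("Content1", Locale.FRENCH));    // Content1_EN (fallback)
        }
    }
    ```

    Because the switch happens at resolution time, the data files themselves stay plain single-language items, and business users can keep publishing them at run time exactly as before.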
    Integrate with Review & Approval processes
    Today the review and approval functionality is based on WebCenter Content Criteria Workflows. A Criteria Workflow uses the metadata of a checked-in document to evaluate whether the document should enter a review/approval process. For instance, if a Criteria Workflow is configured to catch any document with Version "2" or higher and Content Type "Instructions", any matching content item revision will, on check-in, enter the workflow before it is released for general access. A few things to consider when configuring Criteria Workflows:
    - Do not trigger on version 1 for content items that are data files. If you trigger on version 1 you will not only be approving an empty document, you will also have a Content Presenter pointing to a non-existing document, since the document only becomes available after the workflow completes successfully.
    - Approval workflows sometimes require more complex criteria; in that case the recommendation is to have the metadata that triggers the criteria populated automatically, which can be achieved through several approaches, including Content Profiles.
    - Criteria Workflows are configured and managed in the WebCenter Content Administration Applets, where you can define one or more workflows.
    Once Criteria Workflows are configured, the Content Presenter supports editors with the approval process directly inline in the portal's "Contribution mode". In addition to approve/reject actions and task details, the Content Presenter natively lets the user view the current and the pending version of the change he or she is approving (the original post includes a screenshot of this).

    Architectural recommendations
    - To support review and approval processes, minimize the number of data files per page.
    - Each CMIS query can consume significant time depending on its complexity, so minimize the number of CMIS queries per page.
    - Use Content Presenter templates based on ADF; this minimizes design considerations and optimizes the use of caching.
    - Implement the page with as few data files as possible; this simplifies the publishing and release processes and increases performance.
    - A named data file (node), or a list of named nodes, integrated into a page performs better than querying for data.
    - A named data file (node), or a list of named nodes, also enables business-centric page creation and publishing and reduces the need for IT department interaction.

    Summary
    Just because an architectural decision solves a business problem does not mean it is the right one; when designing portals, all architectural choices have to work in harmony rather than against each other. The most technically complex solution is rarely the best, since it will most likely defeat business accessibility, performance or both. The best approach is to design first for simplicity, so that even a non-technical user can operate the result, then consider the performance impact, and finally look at the technology challenges this brings: work around them first with out-of-the-box features, and only after that design and develop custom functions to complement the shortcomings.

    Read the article

  • Problems with capture TV card

    - by user8270
    Hi, I have a TV card that I have not managed to get working under Ubuntu 10.10 i386. I have followed several forum threads without success; I hope you can help me install it. Thank you.
    lspci:
    01:07.0 Multimedia controller: Philips Semiconductors SAA7130 Video Broadcast Decoder (rev 01)
    dmesg:
    [10299.516344] saa7134 ALSA driver for DMA sound unloaded [11385.340661] Linux video capture interface: v2.00 [11385.384278] saa7130/34: v4l2 driver version 0.2.16 loaded [11385.384390] saa7130[0]: found at 0000:01:07.0, rev: 1, irq: 17, latency: 32, mmio: 0x0 [11385.384403] saa7130[0]: subsystem: 1131:0000, board: LifeView/Typhoon FlyVIDEO2000 [card=3,insmod option] [11385.384412] saa7130[0]: can't get MMIO memory @ 0x0 [11385.384431] saa7134: probe of 0000:01:07.0 failed with error -16 [11385.401174] saa7134 ALSA driver for DMA sound loaded [11385.401182] saa7134 ALSA: no saa7134 cards found [11477.797019] tvtime[12534]: segfault at 6b0 ip 0804cf64 sp bf928a4c error 4 in tvtime[8048000+76000] [11626.141821] tvtime[12549]: segfault at 6b0 ip 0804cf64 sp bfec357c error 4 in tvtime[8048000+76000] [12218.120632] saa7134 ALSA driver for DMA sound unloaded [12464.993061] Linux video capture interface: v2.00 [12465.028285] saa7130/34: v4l2 driver version 0.2.16 loaded [12465.028392] saa7130[0]: found at 0000:01:07.0, rev: 1, irq: 17, latency: 32, mmio: 0x0 [12465.028404] saa7134: <rant> [12465.028406] saa7134: Congratulations! Your TV card vendor saved a few [12465.028408] saa7134: cents for a eeprom, thus your pci board has no [12465.028411] saa7134: subsystem ID and I can't identify it automatically [12465.028414] saa7134: </rant> [12465.028416] saa7134: I feel better now. Ok, here are the good news: [12465.028418] saa7134: You can use the card=<nr> insmod option to specify [12465.028421] saa7134: which board do you have. 
The list: [12465.028428] saa7134: card=0 -> UNKNOWN/GENERIC [12465.028435] saa7134: card=1 -> Proteus Pro [philips reference design] 1131:2001 1131:2001 [12465.028447] saa7134: card=2 -> LifeView FlyVIDEO3000 5168:0138 4e42:0138 [12465.028457] saa7134: card=3 -> LifeView/Typhoon FlyVIDEO2000 5168:0138 4e42:0138 [12465.028467] saa7134: card=4 -> EMPRESS 1131:6752 [12465.028475] saa7134: card=5 -> SKNet Monster TV 1131:4e85 [12465.028484] saa7134: card=6 -> Tevion MD 9717 [12465.028491] saa7134: card=7 -> KNC One TV-Station RDS / Typhoon TV Tune 1131:fe01 1894:fe01 [12465.028501] saa7134: card=8 -> Terratec Cinergy 400 TV 153b:1142 [12465.028510] saa7134: card=9 -> Medion 5044 [12465.028517] saa7134: card=10 -> Kworld/KuroutoShikou SAA7130-TVPCI [12465.028523] saa7134: card=11 -> Terratec Cinergy 600 TV 153b:1143 [12465.028532] saa7134: card=12 -> Medion 7134 16be:0003 16be:5000 [12465.028542] saa7134: card=13 -> Typhoon TV+Radio 90031 [12465.028548] saa7134: card=14 -> ELSA EX-VISION 300TV 1048:226b [12465.028557] saa7134: card=15 -> ELSA EX-VISION 500TV 1048:226a [12465.028565] saa7134: card=16 -> ASUS TV-FM 7134 1043:4842 1043:4830 1043:4840 [12465.028576] saa7134: card=17 -> AOPEN VA1000 POWER 1131:7133 [12465.028585] saa7134: card=18 -> BMK MPEX No Tuner [12465.028592] saa7134: card=19 -> Compro VideoMate TV 185b:c100 [12465.028600] saa7134: card=20 -> Matrox CronosPlus 102b:48d0 [12465.028608] saa7134: card=21 -> 10MOONS PCI TV CAPTURE CARD 1131:2001 [12465.028617] saa7134: card=22 -> AverMedia M156 / Medion 2819 1461:a70b [12465.028625] saa7134: card=23 -> BMK MPEX Tuner [12465.028632] saa7134: card=24 -> KNC One TV-Station DVR 1894:a006 [12465.028640] saa7134: card=25 -> ASUS TV-FM 7133 1043:4843 [12465.028648] saa7134: card=26 -> Pinnacle PCTV Stereo (saa7134) 11bd:002b [12465.028657] saa7134: card=27 -> Manli MuchTV M-TV002 [12465.028663] saa7134: card=28 -> Manli MuchTV M-TV001 [12465.028670] saa7134: card=29 -> Nagase Sangyo TransGear 3000TV 1461:050c [12465.028679] saa7134: card=30 -> Elitegroup ECS TVP3XP FM1216 Tuner Card( 1019:4cb4 [12465.028687] saa7134: card=31 -> Elitegroup ECS TVP3XP FM1236 Tuner Card 1019:4cb5 [12465.028695] saa7134: card=32 -> AVACS SmartTV [12465.028702] saa7134: card=33 -> AVerMedia DVD EZMaker 1461:10ff [12465.028710] saa7134: card=34 -> Noval Prime TV 7133 [12465.028717] saa7134: card=35 -> AverMedia AverTV Studio 305 1461:2115 [12465.028725] saa7134: card=36 -> UPMOST PURPLE TV 12ab:0800 [12465.028734] saa7134: card=37 -> Items MuchTV Plus / IT-005 [12465.028740] saa7134: card=38 -> Terratec Cinergy 200 TV 153b:1152 [12465.028749] saa7134: card=39 -> LifeView FlyTV Platinum Mini 5168:0212 4e42:0212 5169:1502 [12465.028760] saa7134: card=40 -> Compro VideoMate TV PVR/FM 185b:c100 [12465.028768] saa7134: card=41 -> Compro VideoMate TV Gold+ 185b:c100 [12465.028776] saa7134: card=42 -> Sabrent SBT-TVFM (saa7130) [12465.028783] saa7134: card=43 -> :Zolid Xpert TV7134 [12465.028790] saa7134: card=44 -> Empire PCI TV-Radio LE [12465.028796] saa7134: card=45 -> Avermedia AVerTV Studio 307 1461:9715 [12465.028805] saa7134: card=46 -> AVerMedia Cardbus TV/Radio (E500) 1461:d6ee [12465.028813] saa7134: card=47 -> Terratec Cinergy 400 mobile 153b:1162 [12465.028821] saa7134: card=48 -> Terratec Cinergy 600 TV MK3 153b:1158 [12465.028830] saa7134: card=49 -> Compro VideoMate Gold+ Pal 185b:c200 [12465.028838] saa7134: card=50 -> Pinnacle PCTV 300i DVB-T + PAL 11bd:002d [12465.028847] saa7134: card=51 -> ProVideo PV952 1540:9524 [12465.028855] saa7134: card=52 
-> AverMedia AverTV/305 1461:2108 [12465.028863] saa7134: card=53 -> ASUS TV-FM 7135 1043:4845 [12465.028871] saa7134: card=54 -> LifeView FlyTV Platinum FM / Gold 5168:0214 5168:5214 1489:0214 5168:0304 [12465.028884] saa7134: card=55 -> LifeView FlyDVB-T DUO / MSI TV@nywhere D 5168:0306 4e42:0306 [12465.028894] saa7134: card=56 -> Avermedia AVerTV 307 1461:a70a [12465.028903] saa7134: card=57 -> Avermedia AVerTV GO 007 FM 1461:f31f [12465.028911] saa7134: card=58 -> ADS Tech Instant TV (saa7135) 1421:0350 1421:0351 1421:0370 1421:1370 [12465.028924] saa7134: card=59 -> Kworld/Tevion V-Stream Xpert TV PVR7134 [12465.028931] saa7134: card=60 -> LifeView/Typhoon/Genius FlyDVB-T Duo Car 5168:0502 4e42:0502 1489:0502 [12465.028942] saa7134: card=61 -> Philips TOUGH DVB-T reference design 1131:2004 [12465.028951] saa7134: card=62 -> Compro VideoMate TV Gold+II [12465.028958] saa7134: card=63 -> Kworld Xpert TV PVR7134 [12465.028964] saa7134: card=64 -> FlyTV mini Asus Digimatrix 1043:0210 [12465.028973] saa7134: card=65 -> V-Stream Studio TV Terminator [12465.028980] saa7134: card=66 -> Yuan TUN-900 (saa7135) [12465.028986] saa7134: card=67 -> Beholder BeholdTV 409 FM 0000:4091 [12465.028995] saa7134: card=68 -> GoTView 7135 PCI 5456:7135 [12465.029003] saa7134: card=69 -> Philips EUROPA V3 reference design 1131:2004 [12465.029011] saa7134: card=70 -> Compro Videomate DVB-T300 185b:c900 [12465.029020] saa7134: card=71 -> Compro Videomate DVB-T200 185b:c901 [12465.029028] saa7134: card=72 -> RTD Embedded Technologies VFG7350 1435:7350 [12465.029036] saa7134: card=73 -> RTD Embedded Technologies VFG7330 1435:7330 [12465.029045] saa7134: card=74 -> LifeView FlyTV Platinum Mini2 14c0:1212 [12465.029053] saa7134: card=75 -> AVerMedia AVerTVHD MCE A180 1461:1044 [12465.029062] saa7134: card=76 -> SKNet MonsterTV Mobile 1131:4ee9 [12465.029070] saa7134: card=77 -> Pinnacle PCTV 40i/50i/110i (saa7133) 11bd:002e [12465.029078] saa7134: card=78 -> ASUSTeK P7131 Dual 1043:4862 [12465.029087] saa7134: card=79 -> Sedna/MuchTV PC TV Cardbus TV/Radio (ITO [12465.029094] saa7134: card=80 -> ASUS Digimatrix TV 1043:0210 [12465.029102] saa7134: card=81 -> Philips Tiger reference design 1131:2018 [12465.029110] saa7134: card=82 -> MSI TV@Anywhere plus 1462:6231 1462:8624 [12465.029120] saa7134: card=83 -> Terratec Cinergy 250 PCI TV 153b:1160 [12465.029128] saa7134: card=84 -> LifeView FlyDVB Trio 5168:0319 [12465.029137] saa7134: card=85 -> AverTV DVB-T 777 1461:2c05 1461:2c05 [12465.029147] saa7134: card=86 -> LifeView FlyDVB-T / Genius VideoWonder D 5168:0301 1489:0301 [12465.029156] saa7134: card=87 -> ADS Instant TV Duo Cardbus PTV331 0331:1421 [12465.029165] saa7134: card=88 -> Tevion/KWorld DVB-T 220RF 17de:7201 [12465.029173] saa7134: card=89 -> ELSA EX-VISION 700TV 1048:226c [12465.029182] saa7134: card=90 -> Kworld ATSC110/115 17de:7350 17de:7352 [12465.029191] saa7134: card=91 -> AVerMedia A169 B 1461:7360 [12465.029200] saa7134: card=92 -> AVerMedia A169 B1 1461:6360 [12465.029208] saa7134: card=93 -> Medion 7134 Bridge #2 16be:0005 [12465.029216] saa7134: card=94 -> LifeView FlyDVB-T Hybrid Cardbus/MSI TV 5168:3306 5168:3502 5168:3307 4e42:3502 [12465.029229] saa7134: card=95 -> LifeView FlyVIDEO3000 (NTSC) 5169:0138 [12465.029238] saa7134: card=96 -> Medion Md8800 Quadro 16be:0007 16be:0008 16be:000d [12465.029249] saa7134: card=97 -> LifeView FlyDVB-S /Acorp TV134DS 5168:0300 4e42:0300 [12465.029259] saa7134: card=98 -> Proteus Pro 2309 0919:2003 [12465.029267] saa7134: card=99 -> AVerMedia TV 
Hybrid A16AR 1461:2c00 [12465.029276] saa7134: card=100 -> Asus Europa2 OEM 1043:4860 [12465.029284] saa7134: card=101 -> Pinnacle PCTV 310i 11bd:002f [12465.029293] saa7134: card=102 -> Avermedia AVerTV Studio 507 1461:9715 [12465.029301] saa7134: card=103 -> Compro Videomate DVB-T200A [12465.029308] saa7134: card=104 -> Hauppauge WinTV-HVR1110 DVB-T/Hybrid 0070:6700 0070:6701 0070:6702 0070:6703 0070:6704 0070:6705 [12465.029324] saa7134: card=105 -> Terratec Cinergy HT PCMCIA 153b:1172 [12465.029332] saa7134: card=106 -> Encore ENLTV 1131:2342 1131:2341 3016:2344 [12465.029344] saa7134: card=107 -> Encore ENLTV-FM 1131:230f [12465.029352] saa7134: card=108 -> Terratec Cinergy HT PCI 153b:1175 [12465.029360] saa7134: card=109 -> Philips Tiger - S Reference design [12465.029367] saa7134: card=110 -> Avermedia M102 1461:f31e [12465.029375] saa7134: card=111 -> ASUS P7131 4871 1043:4871 [12465.029384] saa7134: card=112 -> ASUSTeK P7131 Hybrid 1043:4876 [12465.029392] saa7134: card=113 -> Elitegroup ECS TVP3XP FM1246 Tuner Card 1019:4cb6 [12465.029401] saa7134: card=114 -> KWorld DVB-T 210 17de:7250 [12465.029409] saa7134: card=115 -> Sabrent PCMCIA TV-PCB05 0919:2003 [12465.029418] saa7134: card=116 -> 10MOONS TM300 TV Card 1131:2304 [12465.029426] saa7134: card=117 -> Avermedia Super 007 1461:f01d [12465.029435] saa7134: card=118 -> Beholder BeholdTV 401 0000:4016 [12465.029443] saa7134: card=119 -> Beholder BeholdTV 403 0000:4036 [12465.029451] saa7134: card=120 -> Beholder BeholdTV 403 FM 0000:4037 [12465.029459] saa7134: card=121 -> Beholder BeholdTV 405 0000:4050 [12465.029468] saa7134: card=122 -> Beholder BeholdTV 405 FM 0000:4051 [12465.029476] saa7134: card=123 -> Beholder BeholdTV 407 0000:4070 [12465.029484] saa7134: card=124 -> Beholder BeholdTV 407 FM 0000:4071 [12465.029493] saa7134: card=125 -> Beholder BeholdTV 409 0000:4090 [12465.029501] saa7134: card=126 -> Beholder BeholdTV 505 FM 5ace:5050 [12465.029510] saa7134: card=127 -> Beholder BeholdTV 507 FM / BeholdTV 509 5ace:5070 5ace:5090 [12465.029520] saa7134: card=128 -> Beholder BeholdTV Columbus TVFM 0000:5201 [12465.029528] saa7134: card=129 -> Beholder BeholdTV 607 FM 5ace:6070 [12465.029537] saa7134: card=130 -> Beholder BeholdTV M6 5ace:6190 [12465.029545] saa7134: card=131 -> Twinhan Hybrid DTV-DVB 3056 PCI 1822:0022 [12465.029554] saa7134: card=132 -> Genius TVGO AM11MCE [12465.029560] saa7134: card=133 -> NXP Snake DVB-S reference design [12465.029567] saa7134: card=134 -> Medion/Creatix CTX953 Hybrid 16be:0010 [12465.029576] saa7134: card=135 -> MSI TV@nywhere A/D v1.1 1462:8625 [12465.029584] saa7134: card=136 -> AVerMedia Cardbus TV/Radio (E506R) 1461:f436 [12465.029592] saa7134: card=137 -> AVerMedia Hybrid TV/Radio (A16D) 1461:f936 [12465.029601] saa7134: card=138 -> Avermedia M115 1461:a836 [12465.029609] saa7134: card=139 -> Compro VideoMate T750 185b:c900 [12465.029617] saa7134: card=140 -> Avermedia DVB-S Pro A700 1461:a7a1 [12465.029626] saa7134: card=141 -> Avermedia DVB-S Hybrid+FM A700 1461:a7a2 [12465.029634] saa7134: card=142 -> Beholder BeholdTV H6 5ace:6290 [12465.029642] saa7134: card=143 -> Beholder BeholdTV M63 5ace:6191 [12465.029651] saa7134: card=144 -> Beholder BeholdTV M6 Extra 5ace:6193 [12465.029659] saa7134: card=145 -> AVerMedia MiniPCI DVB-T Hybrid M103 1461:f636 1461:f736 [12465.029669] saa7134: card=146 -> ASUSTeK P7131 Analog [12465.029676] saa7134: card=147 -> Asus Tiger 3in1 1043:4878 [12465.029684] saa7134: card=148 -> Encore ENLTV-FM v5.3 1a7f:2008 [12465.029693] saa7134: 
card=149 -> Avermedia PCI pure analog (M135A) 1461:f11d [12465.029701] saa7134: card=150 -> Zogis Real Angel 220 [12465.029708] saa7134: card=151 -> ADS Tech Instant HDTV 1421:0380 [12465.029716] saa7134: card=152 -> Asus Tiger Rev:1.00 1043:4857 [12465.029725] saa7134: card=153 -> Kworld Plus TV Analog Lite PCI 17de:7128 [12465.029733] saa7134: card=154 -> Avermedia AVerTV GO 007 FM Plus 1461:f31d [12465.029742] saa7134: card=155 -> Hauppauge WinTV-HVR1150 ATSC/QAM-Hybrid 0070:6706 0070:6708 [12465.029752] saa7134: card=156 -> Hauppauge WinTV-HVR1120 DVB-T/Hybrid 0070:6707 0070:6709 0070:670a [12465.029763] saa7134: card=157 -> Avermedia AVerTV Studio 507UA 1461:a11b [12465.029772] saa7134: card=158 -> AVerMedia Cardbus TV/Radio (E501R) 1461:b7e9 [12465.029780] saa7134: card=159 -> Beholder BeholdTV 505 RDS 0000:505b [12465.029789] saa7134: card=160 -> Beholder BeholdTV 507 RDS 0000:5071 [12465.029797] saa7134: card=161 -> Beholder BeholdTV 507 RDS 0000:507b [12465.029806] saa7134: card=162 -> Beholder BeholdTV 607 FM 5ace:6071 [12465.029815] saa7134: card=163 -> Beholder BeholdTV 609 FM 5ace:6090 [12465.029823] saa7134: card=164 -> Beholder BeholdTV 609 FM 5ace:6091 [12465.029832] saa7134: card=165 -> Beholder BeholdTV 607 RDS 5ace:6072 [12465.029840] saa7134: card=166 -> Beholder BeholdTV 607 RDS 5ace:6073 [12465.029849] saa7134: card=167 -> Beholder BeholdTV 609 RDS 5ace:6092 [12465.029857] saa7134: card=168 -> Beholder BeholdTV 609 RDS 5ace:6093 [12465.029866] saa7134: card=169 -> Compro VideoMate S350/S300 185b:c900 [12465.029874] saa7134: card=170 -> AverMedia AverTV Studio 505 1461:a115 [12465.029883] saa7134: card=171 -> Beholder BeholdTV X7 5ace:7595 [12465.029892] saa7134: card=172 -> RoverMedia TV Link Pro FM 19d1:0138 [12465.029900] saa7134: card=173 -> Zolid Hybrid TV Tuner PCI 1131:2004 [12465.029909] saa7134: card=174 -> Asus Europa Hybrid OEM 1043:4847 [12465.029917] saa7134: card=175 -> Leadtek Winfast DTV1000S 107d:6655 [12465.029926] saa7134: card=176 -> Beholder BeholdTV 505 RDS 0000:5051 [12465.029934] saa7134: card=177 -> Hawell HW-404M7 [12465.029941] saa7134: card=178 -> Beholder BeholdTV H7 [12465.029948] saa7134: card=179 -> Beholder BeholdTV A7 [12465.029955] saa7134: card=180 -> Avermedia PCI M733A 1461:4155 1461:4255 [12465.029967] saa7130[0]: subsystem: 1131:0000, board: UNKNOWN/GENERIC [card=0,autodetected] [12465.030033] saa7130[0]: can't get MMIO memory @ 0x0 [12465.030051] saa7134: probe of 0000:01:07.0 failed with error -16 [12465.053892] saa7134 ALSA driver for DMA sound loaded [12465.053900] saa7134 ALSA: no saa7134 cards found tvtime-scanner Leyendo la configuración de /etc/tvtime/tvtime.xml Leyendo la configuración de /home/ricardo/.tvtime/tvtime.xml Escaneando usando la norma de TV NTSC. /home/ricardo/.tvtime/stationlist.xml: No existing NTSC station list "Custom". videoinput: Cannot open capture device /dev/video0: No existe el dispositivo o la dirección ls /dev/ No video directory here

    Read the article

  • CodePlex Daily Summary for Friday, May 07, 2010

    CodePlex Daily Summary for Friday, May 07, 2010New ProjectsBibleBrowser: BibleBrowserBibleMaps: BibleMapsChristianLibrary: ChristianLibraryCLB Podcast Module: DotNetNuke Module used to allow DNN to host one or more podcasts within a portal.Coletivo InVitro: Nova versão do Site do ColetivoCustomer Care Accelerator for Microsoft Dynamics CRM: Customer Care Accelerator for Microsoft Dynamics CRM.EasyTFS: A very lightweight, quick, web-based search application for Team Foundation Server. EasyTfs searches as you type, providing real-time search resul...FSCommunity: abcGeocache Downloader: GeocacheDownloader helps you download geocache information in an organised way, making easier to copy the information to your device. The applicati...Grabouille: Grabouille aims to be an incubation project for Microsoft best patterns & practices and also a container for last .Net technologies. The goal is, i...Klaverjas: Test application for testing different new technologies in .NET (WCF, DataServices, C# stuff, Entity...etc.)Livecity: Social network. Alpha 0.1MarxSupples: testMOSS 2007 - Excel Services: This helps you understand MOSS 2007 - Excel Services and how to use the same in .NETmy site: a personal web siteNazTek.Extension.Clr35: Contains a set of CLR 3.5 extensions and utility APInetDumbster: netDumbster is a .Net Fake SMTP Server clone of the popular Dumbster (http://quintanasoft.com/dumbster/) netDumbster is based on the API of nDumbs...Object-Oriented Optimization Toolbox (OOOT): A library (.dll) of various linear, nonlinear, and stochastic numerical optimization techniques. While some of these are older than 50 years, they ...OMap - Object to Object Mapper: OMap is a simple object to object mapper. It could be used for scenarios like mapping your data from domain objects into data transfer objects.PDF Renderer for BlackBerry.: Render and view PDF files on BlackBerry using a modified version of Sun's PDF Renderer.Pomodoro Tool: Pomodoro Tool is a timer for http://www.pomodorotechnique.com/ . It's a timer and task tracker with a text task editing interface.ReadingPlan: ReadingPlanRil#: .net library to use the public Readitlater.com public APISCSM Incident SLA Management: This project provides an extension to System Center Service Manager to provide more granular control over incident service level agreement (SLA) ma...SEAH - Sistema Especialista de Agravante de Hipertensão: O SEAH tem como propósito alertar o indivíduo em relação ao seu agravante de hipertensão arterial e a órgãos competentes, entidades de ensino, pesq...StudyGuide: StudyGuideTest Project (ignore): This is used to demonstrate CodePlex at meetings. Please ignore this project.YCC: YCC is an open source c compiler which compatible with ANSI standard.The project is currently an origin start.We will work it for finally useable a...New ReleasesAlbum photo de club - Club's Photos Album: App - version 0.5: Modifications : - Ajout des favoris - Ajout de l'update automatique /*/ - Add favorites - Add automatic updateBoxee Launcher: Boxee Launcher 1.0.1.5: Boxee Launcher finds the BOXEE executable using a registry key that BOXEE creates. The new version of BOXEE changed the location. 
Boxee Launcher ha...CBM-Command: 2010-05-06: Release Notes - 2010-05-06New Features Creating Directories Deleting Files and Directories Renaming Files and Directories Changes 40 columns i...Customer Care Accelerator for Microsoft Dynamics CRM: Customer Care Accelerator for Dynamics CRM R1: The Customer Care Accelerator (CCA) for Microsoft Dynamics CRM focuses on delivering contact center enabling functionality, such as the ability to ...D-AMPS: D-AMPS 0.9.2: Add .bat files for command-line running Bug fixed (core engine) Section 6, 8, 9 modifications Sources (Fortran) for core engineDynamicJson: Release 1.1.0.0: Add - foreach support Add - Dynamic Shortcut of IsDefined,Delete,Deserialize Fix - Deserialize Delete - LengthEasyTFS: EasyTfs 1.0 Beta 1: A very lightweight, quick, web-based search application for Team Foundation Server. EasyTfs searches as you type, providing real-time search resul...Event Scavenger: Add installer for Admin tool: Added installer for Admin tool. Removed exe's for admin and viewer from zip file - were replaced by the msi installers.Expression Blend Samples: PathListBoxUtils for Expression Blend 4 RC: Initial release of the PathListBoxUtils samples.HackingSilverlight Code Browser: HackingSilverlight Code Browser: Out with the old and in with the new... the HackingSilverlight Code Browser is a reference tool for code snippets so that I can not have to remembe...Hammock for REST: Hammock v1.0.3: v1.0.3 ChangesFixes for OAuth escaping and API usage Added FollowRedirects feature to RestClient/RestRequest v1.0.2 Changes.NET 4.0 and Client P...ImmlPad: ImmlPad Beta 1.1.1: Changes in this release: Added more intelligent right-click menu's to allow opening an IMML document with a specific Player version Fixed issue w...LinkedIn® for Windows Mobile: LinkedIn for Windows Mobile v0.8: Improved error message dumping + moved OAuth parameters from www.* to api.* In case of unexpected errors, check "Application Data\LinkedIn for Wind...Live-Exchange Calendar Sync: Installer: Alpha release of Live-Exchange Calendar SyncMAPILab Explorer for SharePoint: MAPILab Explorer for SharePoint ver 2.1.0: 1) Get settings form old versions 2) Rules added to display enumerable object items. 3) Bug fixed with remove persisted object How to install:Do...MapWindow6: MapWindow 6.0 msi May 6, 2010: This release enables output .prj files to also show the ESRI names for the PRJCS, GEOCS, and the DATUM. It also fixes a bug that was preventing th...MOSS 2007 - Excel Services: Calculator using Excel Services: Simple calculator using Excel ServicesMvcMaps - Unified Bing/Google Mapping API for ASP.NET MVC: MvcMaps Preview 1 for ASP.NET 4.0 and VS'2010: There was a change in ASP.NET 4.0 that broke the release, so a small modification needed to be made to the reflection code. This release fixes that...NazTek.Extension.Clr35: NazTek.Extension.Clr35 Binary Cab: Binary cab fileNazTek.Extension.Clr35: NazTek.Extension.Clr35 Source Cab: Source codePDF Renderer for BlackBerry.: PDF Renderer 0.1 for BlackBerry: This library requires a BlackBerry Signing Key in order to compile for use on a BlackBerry device. Signing keys can be obtained at BlackBerry Code ...Pomodoro Tool: PomodoroTool Clickonce installer: PomodoroTool Clickonce installerPOS for .Net Handheld Products Service Object: POS for .Net Handheld Products Service Object 1002: New version (1.0.0.2) which should support 64 bit platforms (see ReadMe.txt included with source for details). 
Source code only.QuestTracker: QuestTracker 0.4: What's New in QuestTracker 0.4 - - You can now drag and drop the quests on the left pane to rearrange or move quests from one group to another. - D...RDA Collaboration Team Projects: Property Bag Cmdlet: This cmdlet allows to retrieve, insert and update property bag values at farm, web app, site and web scope. The same operations can be in code usi...Ril#: Rilsharp 1.0: The first version of the Ril# (Readitlater sharp) library.Scrum Sprint Monitor: v1.0.0.47911 (.NET 4-TFS 2010): What is new in this release? Migrated to .NET Framework 4 RTM; Compiled against TFS 2010 RTM Client DLLs; Smoother animations with easing funct...SCSM Incident SLA Management: SCSM Incident SLA Management Version 0.1: This is the first release of the SCSM SLA Management solution. It is an 'alpha' release and has only been tested by the developers on the project....StackOverflow Desktop Client in C# and WPF: StackOverflow Client 0.4: Shows a popup that displays all the new questions and allows you to navigate between them. Fixed a bug that showed incorrect views and answers in t...Transcriber: Transcriber V0.1: Pre-release, usable but very rough.VCC: Latest build, v2.1.30506.0: Automatic drop of latest buildVisual Studio CSLA Extension for ADO.NET Entity Framework: CslaExtension Beta1: Requirements Visual Studio 2010 CSLA 4.0. Beta 1 Installation Download VSIX file and double click to install. Open Visual Studio -> Tools -> Exte...Most Popular ProjectsRawrWBFS ManagerAJAX Control ToolkitMicrosoft SQL Server Product Samples: DatabaseSilverlight Toolkitpatterns & practices – Enterprise LibraryWindows Presentation Foundation (WPF)ASP.NETDotNetNuke® Community EditionMicrosoft SQL Server Community & SamplesMost Active Projectspatterns & practices – Enterprise LibraryAJAX Control FrameworkIonics Isapi Rewrite FilterRawrpatterns & practices: Azure Security GuidanceCaliburn: An Application Framework for WPF and SilverlightBlogEngine.NETTweetSharpNB_Store - Free DotNetNuke Ecommerce Catalog ModuleTinyProject

    Read the article

  • CodePlex Daily Summary for Thursday, January 13, 2011

    CodePlex Daily Summary for Thursday, January 13, 2011Popular ReleasesMVC Music Store: MVC Music Store v2.0: This is the 2.0 release of the MVC Music Store Tutorial. This tutorial is updated for ASP.NET MVC 3 and Entity Framework Code-First, and contains fixes and improvements based on feedback and common questions from previous releases. The main download, MvcMusicStore-v2.0.zip, contains everything you need to build the sample application, including A detailed tutorial document in PDF format Assets you will need to build the project, including images, a stylesheet, and a pre-populated databas...Free Silverlight & WPF Chart Control - Visifire: Visifire SL and WPF Charts v3.6.7 GA Released: Hi, Today we are releasing Visifire 3.6.7 GA with the following feature: * Inlines property has been implemented in Title. Also, this release contains fix for the following bugs: * In Column and Bar chart DataPoint’s label properties were not working as expected at real-time if marker enabled was set to true. * 3D Column and Bar chart were not rendered properly if AxisMinimum property was set in x-axis. You can download Visifire v3.6.7 here. Cheers, Team VisifireFluent Validation for .NET: 2.0: Changes since 2.0 RC Fix typo in the name of FallbackAwareResourceAccessorBuilder Fix issue #7062 - allow validator selectors to work against nullable properties with overriden names. Fix error in German localization. Better support for client-side validation messages in MVC integration. All changes since 1.3 Allow custom MVC ModelValidators to be added to the FVModelValidatorProvider Support resource provider for custom property validators through the new IResourceAccessorBuilder ...EnhSim: EnhSim 2.3.2 ALPHA: 2.3.1 ALPHAThis release supports WoW patch 4.03a at level 85 To use this release, you must have the Microsoft Visual C++ 2010 Redistributable Package installed. This can be downloaded from http://www.microsoft.com/downloads/en/details.aspx?FamilyID=A7B7A05E-6DE6-4D3A-A423-37BF0912DB84 To use the GUI you must have the .NET 4.0 Framework installed. This can be downloaded from http://www.microsoft.com/downloads/en/details.aspx?FamilyID=9cfb2d51-5ff4-4491-b0e5-b386f32c0992 - Quick update to ...ASP.NET MVC Project Awesome, jQuery Ajax helpers (controls): 1.6: A rich set of helpers (controls) that you can use to build highly responsive and interactive Ajax-enabled Web applications. These helpers include Autocomplete, AjaxDropdown, Lookup, Confirm Dialog, Popup Form, Popup and Pager new stuff: paging for the lookup lookup with multiselect changes: the css classes used by the framework where renamed to be more standard the lookup controller requries an item.ascx (no more ViewData["structure"]), and LookupList action renamed to Search all the...pwTools: pwTools: Changelog v1.0 base release??????????: All-In-One Code Framework ??? 2011-01-12: 2011???????All-In-One Code Framework(??) 2011?1??????!!http://i3.codeplex.com/Project/Download/FileDownload.aspx?ProjectName=1code&DownloadId=128165 ?????release?,???????ASP.NET, AJAX, WinForm, Windows Shell????13?Sample Code。???,??????????sample code。 ?????:http://blog.csdn.net/sjb5201/archive/2011/01/13/6135037.aspx ??,??????MSDN????????????。 http://social.msdn.microsoft.com/Forums/zh-CN/codezhchs/threads ?????????????????,??Email ????patterns & practices – Enterprise Library: Enterprise Library 5.0 - Extensibility Labs: This is a preview release of the Hands-on Labs to help you learn and practice different ways the Enterprise Library can be extended. 
Learning MapCustom exception handler (estimated time to complete: 1 hr 15 mins) Custom logging trace listener (1 hr) Custom configuration source (registry-based) (30 mins) System requirementsEnterprise Library 5.0 / Unity 2.0 installed SQL Express 2008 installed Visual Studio 2010 Pro (or better) installed AuthorsChris Tavares, Microsoft Corporation ...Orchard Project: Orchard 1.0: Orchard Release Notes Build: 1.0.20 Published: 1/12/2010 How to Install OrchardTo install the Orchard tech preview using Web PI, follow these instructions: http://www.orchardproject.net/docs/Installing-Orchard.ashx Web PI will detect your hardware environment and install the application. --OR-- Alternatively, to install the release manually, download the Orchard.Web.1.0.20.zip file. The zip contents are pre-built and ready-to-run. Simply extract the contents of the Orchard folder from ...Umbraco CMS: Umbraco 4.6.1: The Umbraco 4.6.1 (codename JUNO) release contains many new features focusing on an improved installation experience, a number of robust developer features, and contains nearly 200 bug fixes since the 4.5.2 release. Getting Started A great place to start is with our Getting Started Guide: Getting Started Guide: http://umbraco.codeplex.com/Project/Download/FileDownload.aspx?DownloadId=197051 Make sure to check the free foundation videos on how to get started building Umbraco sites. They're ...Google URL Shortener API for .NET: Google URL Shortener API v1: According follow specification: http://code.google.com/apis/urlshortener/v1/reference.htmlStyleCop for ReSharper: StyleCop for ReSharper 5.1.14986.000: A considerable amount of work has gone into this release: Features: Huge focus on performance around the violation scanning subsystem: - caching added to reduce IO operations around reading and merging of settings files - caching added to reduce creation of expensive objects Users should notice condsiderable perf boost and a decrease in memory usage. Bug Fixes: - StyleCop's new ObjectBasedEnvironment object does not resolve the StyleCop installation path, thus it does not return the ...SQL Monitor - tracking sql server activities: SQL Monitor 3.1 beta 1: 1. support alert message template 2. dynamic toolbar commands depending on functionality 3. fixed some bugs 4. refactored part of the code, now more stable and more clean upFacebook C# SDK: 4.2.1: - Authentication bug fixes - Updated Json.Net to version 4.0.0 - BREAKING CHANGE: Removed cookieSupport config setting, now automatic. This download is also availible on NuGet: Facebook FacebookWeb FacebookWebMvcHawkeye - The .Net Runtime Object Editor: Hawkeye 1.2.5: In the case you are running an x86 Windows and you installed Release 1.2.4, you should consider upgrading to this release (1.2.5) as it appears Hawkeye is broken on x86 OS. I apologize for the inconvenience, but it appears Hawkeye 1.2.4 (and probably previous versions) doesn't run on x86 Windows (See issue http://hawkeye.codeplex.com/workitem/7791). This maintenance release fixes this broken behavior. This release comes in two flavors: Hawkeye.125.N2 is the standard .NET 2 build, was compile...Phalanger - The PHP Language Compiler for the .NET Framework: 2.0 (January 2011): Another release build for daily use; it contains many new features, enhanced compatibility with latest PHP opensource applications and several issue fixes. To improve the performance of your application using MySQL, please use Managed MySQL Extension for Phalanger. 
Changes made within this release include following: New features available only in Phalanger. Full support of Multi-Script-Assemblies was implemented; you can build your application into several DLLs now. Deploy them separately t...AutoLoL: AutoLoL v1.5.3: A message will be displayed when there's an update available Shows a list of recent mastery files in the Editor Tab (requested by quite a few people) Updater: Update information is now scrollable Added a buton to launch AutoLoL after updating is finished Updated the UI to match that of AutoLoL Fix: Detects and resolves 'Read Only' state on Version.xmlTweetSharp: TweetSharp v2.0.0.0 - Preview 7: Documentation for this release may be found at http://tweetsharp.codeplex.com/wikipage?title=UserGuide&referringTitle=Documentation. Note: This code is currently preview quality. Preview 7 ChangesFixes the regression issue in OAuth from Preview 6 Preview 6 ChangesMaintenance release with user reported fixes Preview 5 ChangesMaintenance release with user reported fixes Third Party Library VersionsHammock v1.0.6: http://hammock.codeplex.com Json.NET 3.5 Release 8: http://json.codeplex.comExtended WPF Toolkit: Extended WPF Toolkit - 1.3.0: What's in the 1.3.0 Release?BusyIndicator ButtonSpinner ChildWindow ColorPicker - Updated (Breaking Changes) DateTimeUpDown - New Control Magnifier - New Control MaskedTextBox - New Control MessageBox NumericUpDown RichTextBox RichTextBoxFormatBar - Updated .NET 3.5 binaries and SourcePlease note: The Extended WPF Toolkit 3.5 is dependent on .NET Framework 3.5 and the WPFToolkit. You must install .NET Framework 3.5 and the WPFToolkit in order to use any features in the To...Ionics Isapi Rewrite Filter: 2.1 latest stable: V2.1 is stable, and is in maintenance mode. This is v2.1.1.25. It is a bug-fix release. There are no new features. 28629 29172 28722 27626 28074 29164 27659 27900 many documentation updates and fixes proper x64 build environment. This release includes x64 binaries in zip form, but no x64 MSI file. You'll have to manually install x64 servers, following the instructions in the documentation.New Projects4chan: Project for educational purposesAE.Net.Mail - POP/IMAP Client: C# POP/IMAP client libraryAlmathy: Application communautaire pour le partage de données basée sur le protocole XMPP. Discussion instantanée, mail, échange de données, travaux partagés. Développée en C#, utilisant les Windows Forms.AMK - Associação Metropolitana de Kendo: Projeto do site versão 2011 da Associação Metropolitana de Kendo. O site contará com: - Cadastro de atletas e eventos; - Controle de cadastro junto a CBK e histórico de graduações; - Calendário de treinos; Será desenvolvido usando .NET, MVC2, Entity Framework, SQL Server, JQueryAzke: New: Azke is a portal developed with ASP.NET MVC and MySQL. Old: Azke is a portal developed with ASP.net and MySQL.BuildScreen: A standalone Windows application to displays project build statuses from TeamCity. It can be used on a large screen for the whole development team to watch their build statuses as well as on a developer machine.CardOnline: CardOnline GameLobby GameCAudioEndpointVolume: The CAudioEndpointVolume implements the IAudioEndpointVolume Interface to control the master volume in Visual Basic 6.0 for Windows Vista and later operating systems.Cloudy - Online Storage Library: The goal of Cloudy is to be an online storage library to enable access for the most common storage services : DropBox, Skydrive, Google Docs, Azure Blob Storage and Amazon S3. 
It'll work in .NET 3.5 and up, Silverlight and Windows Phone 7 (WP7).ContentManager: Content Manger for SAP Kpro exports the data pool of a content-server by PHIOS-Lists (txt-files) into local binary files. Afterwards this files can be imported in a SAP content-repository . The whole List of the PHIOS Keys can also be downloaded an splitted in practical units.CSWFKrad: private FrameworkDiego: Diegodoomhjx_javalib: this is my lib for the java project.Eve Planetary Interaction: Eve Planetary InteractionEventWall: EventWall allows you to show related blogposts and tweets on the big screen. Just configure the hashtag and blog RSS feeds and you're done.Google URL Shortener API for .NET: Google URL Shortener API for .NET is a wrapper for the Google Project below: http://code.google.com/apis/urlshortener/v1/reference.html With Google URL Shortener API, you may shorten urls and get some analytics information about this. It's developer in C# 4.0 and VS2010. HtmlDeploy: A project to compile asp's Master page into pure html files. The idea is to use jQuery templating to do all the "if" and "for" when it comes to generating html output. Inventory Management System: Inventory Management System is a Silverlight-based system. It aims to manage and control the input raw material activites, output of finished goods of a small inventory.koplamp kapot: demo project voor AzureMSAccess SVN: Access SVN adds to Microsoft Access (MS Access) support for SVN Source controlMultiConvert: console based utility primary to convert solid edge files into data exchange formats like *.stp and *.dxf. It's using the API of the original software and can't be run alone. The goal of this project is creating routines with desired pre-instructions for batch conversionsNGN Image Getter: Nationalgeographic has really stunning photos! They allow to download them via website. This program automates that process.OOD CRM: Project CRM for Object Oriented Development final examsOpen NOS Client: Open NOS Rest ClientopenEHR.NET: openEHR.NET is a C# implementation of openEHR (http://openehr.org) Reference Model (RM) and Archetype Model (AM) specifications (Release 1.0.1). It allows you to build openEHR applications by composing RM objects, validate against AM objects and serialise to/from XML.OpenGL Flow Designer: OpenGL Flow Designer allows writing pseudo-C code to build OpenGL pipeline and preview results immediately.Orchard Translation Manager: An Orchard module that aims at facilitating the creation and management of translations of the Orchard CMS and its extensions.pwTools: A set of tools for viewing/editing some perfect world client/server filesServerStatusMobile: Server availability monitoring application for windows mobile 6.5.x running on WVGA (480x800) devices. Supports autorun, vibration & notification when server became unavailable, text log files for each server. .NET Compact Framework 3.5 required for running this applicationSimple Silverlight Multiple File Uploader: The project allows you to achieve multiple file uploading in Silverlight without complex, complicated, or third-party applications. In just two core classes, it's very simple to understand, use, and extend. Snos: Projet Snos linker Snes multi SupportT-4 Templates for ASP.NET Web Form Databound Control Friendly Logical Layers: This open source project includes a set of T-4 templates to enable you to build logical layers (i.e. DAL/BLL) with just few clicks! 
The logical layers implemented here are based on Entity Framework 4.0, ASP.NET Web Form Data Bound control friendly and fully unit testable.The Media Store: The Media StoreUM: source control for the um projecturBook - Your Address Book Manager: urBook is and Address Book Manager that can merge all your contacts on all web services you use with your phone contacts to have one consistent Address Book for all your contacts and to update back your web services contacts with the one consistent address bookWindows Phone Certificate Installer: Helps install Trusted Root Certificates on Windows Phone 7 to enable SSL requests. Intended to allow developers to use localhost web servers during development without requiring the purchase of an SSL certificate. Especially helpful for ws-trust secured web service development.

    Read the article

  • The future for Microsoft

    - by Scott Dorman
    Originally posted on: http://geekswithblogs.net/sdorman/archive/2013/10/16/the-future-for-microsoft.aspx
    Microsoft is in the process of reinventing itself. While some may argue that it’s “too little, too late” or that their growing consumer-focused strategy is wrong, the truth of the situation is that Microsoft is reinventing itself into a new company. While Microsoft is now calling themselves a “devices and services” company, that’s not entirely accurate. Let’s look at some facts. Microsoft will always (for the long-term foreseeable future) be financially split into the following divisions:
    - Windows/Operating Systems, which for FY13 made up approximately 24% of overall revenue.
    - Server and Tools, which for FY13 made up approximately 26% of overall revenue.
    - Enterprise/Business Products, which for FY13 made up approximately 32% of overall revenue.
    - Entertainment and Devices, which for FY13 made up approximately 13% of overall revenue.
    - Online Services, which for FY13 made up approximately 4% of overall revenue.
    It is important to realize that hardware products like the Surface fall under the Windows/Operating Systems division while products like the Xbox 360 fall under the Entertainment and Devices division. (Presumably other hardware, such as mice, keyboards, and cameras, also falls under the Entertainment and Devices division.) It’s also unclear where Microsoft’s recent acquisition of Nokia’s handset division will fall, but let’s assume that it will be under Entertainment and Devices as well. Now, for the sake of argument, let’s assume a slightly different structure that I think is more in line with how Microsoft presents itself and how the general public sees it:
    - Consumer Products and Devices, which would probably make up approximately 9% of overall revenue.
    - Developer Tools, which would probably make up approximately 13% of overall revenue.
    - Enterprise Products and Devices, which would probably make up approximately 47% of overall revenue.
    - Entertainment, which would probably make up approximately 13% of overall revenue.
    - Online Services, which would probably make up approximately 17% of overall revenue.
    (Just so we’re clear, in this structure hardware products like the Surface, a portion of Windows sales, and other hardware fall under the Consumer Products and Devices division. I’m assuming that more of the income for the Windows division comes from enterprise/volume licenses, so 15 percentage points of that income went to the Enterprise Products and Devices division. Most of the enterprise services, like Azure, fall under the Online Services division, so half of the Server and Tools income went there as well.)
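    As a quick back-of-the-envelope check, the hypothetical split above follows directly from the reported FY13 numbers once the two stated assumptions are applied (15 points of Windows revenue reclassified as enterprise licensing, half of Server and Tools reclassified as online services). Here is a throwaway sketch of that arithmetic; nothing in it comes from official Microsoft reporting beyond the approximate percentages already quoted.

    ```java
    // Back-of-the-envelope reallocation of the reported FY13 revenue split,
    // using only the percentages and assumptions stated in the post above.
    public class RevenueSplitCheck {
        public static void main(String[] args) {
            // Reported FY13 division shares (approximate, % of overall revenue).
            double windows = 24, serverAndTools = 26, business = 32,
                   entertainmentAndDevices = 13, onlineServices = 4;

            // Assumption 1: ~15 points of Windows revenue is enterprise/volume licensing.
            double windowsToEnterprise = 15;
            // Assumption 2: roughly half of Server and Tools (e.g. Azure) is really online services.
            double serverToolsToOnline = serverAndTools / 2;

            double consumer       = windows - windowsToEnterprise;           // ~9%
            double developerTools = serverAndTools - serverToolsToOnline;    // ~13%
            double enterprise     = business + windowsToEnterprise;          // ~47%
            double entertainment  = entertainmentAndDevices;                 // ~13%
            double online         = onlineServices + serverToolsToOnline;    // ~17%

            System.out.printf("Consumer %.0f%%, Dev Tools %.0f%%, Enterprise %.0f%%, "
                    + "Entertainment %.0f%%, Online %.0f%%%n",
                    consumer, developerTools, enterprise, entertainment, online);
        }
    }
    ```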
    No matter how you look at it, the bulk of Microsoft’s income still comes not just from the enterprise but also from software sales, and this really shouldn’t surprise anyone. So, now that the stage is set… what’s the future for Microsoft? The future I see for Microsoft (again, this is just my prediction based on my own instinct, gut feel and publicly available information) is this: Microsoft is becoming a consumer-focused enterprise company. Let’s look at it a different way: Microsoft is an enterprise-focused company trying to create a larger consumer presence. To a large extent, this is the exact opposite of Apple, which is really a consumer-focused company trying to create a larger enterprise presence. The major reason consumer-focused companies (like Apple) have started making inroads into the enterprise is the “bring your own device” phenomenon. Yes, Apple has created some “game-changing” products, but their enterprise influence is still relatively small. Unfortunately (for this blog post at least), Apple reports revenue in terms of hardware products rather than business divisions, so it’s not possible to do a direct comparison. However, in the interest of transparency, from Apple’s Quarterly Report (filed 24 July 2013), their revenue breakdown is:
    - iPhone, which for the 3 months ending 29 June 2013 made up approximately 51% of revenue.
    - iPad, which for the 3 months ending 29 June 2013 made up approximately 18% of revenue.
    - Mac, which for the 3 months ending 29 June 2013 made up approximately 14% of revenue.
    - iPod, which for the 3 months ending 29 June 2013 made up approximately 2% of revenue.
    - iTunes, Software, and Services, which for the 3 months ending 29 June 2013 made up approximately 11% of revenue.
    - Accessories, which for the 3 months ending 29 June 2013 made up approximately 3% of revenue.
    From this, it’s pretty clear that Apple is a consumer-and-hardware-focused company. At this point, you may be asking yourself “Where is all of this going?” The answer to that lies in Microsoft’s shift in company focus. They are becoming more consumer focused, but what exactly does that mean? The biggest change (at least the one that’s been in the news lately) is the pending purchase of Nokia’s handset division. This, in combination with their Surface line of tablets and the Xbox, will put Microsoft squarely in the realm of a hardware-focused company in addition to being a software-focused company. That can (and most likely will) shift the revenue split toward looking at revenue based on software sales (both consumer and enterprise) and also hardware sales (mostly on the consumer side). If we look at things strictly from a Windows perspective, Microsoft clearly has a lot of irons in the fire at the moment. Discounting the various product SKUs available and painting the picture with broader strokes, there are currently 5 different Windows-based operating systems:
    - Windows Phone
      - Windows Phone 7.x, which runs on top of the Windows CE kernel
      - Windows Phone 8.x+, which runs on top of the Windows 8 kernel
    - Windows RT, the ARM-based version of Windows 8, which runs on top of the Windows 8 kernel
    - Windows (Pro), the Intel-based version of Windows 8, which runs on top of the Windows 8 kernel
    - Xbox
      - The Xbox 360, which runs its own proprietary OS
      - The Xbox One, which runs its own proprietary OS, a version of Windows running on top of the Windows 8 kernel, and a proprietary “manager” OS which manages the other two
    Over time, Windows Phone 7.x devices will fade, so that really leaves 4 different versions. Looking at Windows RT and Windows Phone 8.x paints an interesting story. Right now, all mobile phones run on some sort of ARM chip, and that doesn’t look like it will change any time soon. That means Microsoft has two different Windows-based operating systems for the ARM platform. Long term, it doesn’t make sense for Microsoft to continue supporting that arrangement. I have long suspected (since the Surface was first announced) that Microsoft will unify these two variants of Windows, and recent speculation from some of the leading Microsoft watchers lends credence to this suspicion. It is rumored that upcoming Windows Phone releases will include support for larger screen sizes, relax the requirement to have a hardware-based back button, and continue to improve API parity between Windows Phone and Windows RT. 
    At the same time, Windows RT will include support for smaller screen sizes. Since both of these operating systems are based on the same core Windows kernel, it makes sense (both from a financial and a development-resource perspective) for Microsoft to unify them. The user interfaces are already very similar; so similar, in fact, that visually it’s difficult to tell them apart. To illustrate this, here are two screen captures. Other than a few variations (the Bing News app, the picture shown in the Pictures tile and the spacing between the tiles) these are identical. The one on the left is from my Windows 8.1 laptop (which looks the same as on my Surface RT) and the one on the right is from my Windows Phone 8 Lumia 925. This pretty clearly shows that, from a consumer perspective, there really is no practical difference between how these two operating systems look and how you interact with them. For the consumer, your entertainment device (Xbox One), your phone (Windows Phone), your mobile computing device (Surface [or some other vendor’s tablet], laptop, netbook or ultrabook) and your desktop computing device will all look and feel the same. While many people will denounce this consistency of user experience, I think this will be a good thing in the long term, especially for the upcoming generations. For example, my 5-year-old son knows how to use my tablet, phone and Xbox because they all feature nearly identical user experiences. When Windows 8 was released, Microsoft allowed a Windows Store app to be purchased once and installed on as many as 5 devices. With Windows 8.1, this limit has been increased to over 50. Why is that important? If you consider that your phone, computing devices, and entertainment device will be running the same operating system (with minor differences related to the physical hardware chipset), that means I could potentially purchase my son’s favorite Angry Birds game once and be able to install it on all of the devices I own. (And for those of you wondering, it’s only 7 [at the moment].) From an app developer perspective, the story becomes even more compelling. Right now there are differences between the different operating systems, but those differences are shrinking. The user interface technology for both is XAML, but there are different controls available and different user experience concepts. Some of the APIs available are the same while some are not. You can’t develop a Windows Phone app that can also run on Windows (either Windows Pro or RT). With each release of Windows Phone and Windows RT, those differences become smaller and smaller. Add to this mix the Xbox One, which will also feature a Windows-based operating system and the same “modern” (tile-based) user interface, and the visible distinctions between the operating systems become even smaller. Unifying the operating systems means one set of APIs and one code base to maintain for an app that can run on multiple devices. One code base means it’s easier to add features and fix bugs, and those changes become available on all devices at the same time. It also means a single app store, which will increase the discoverability and reach of your app and consolidate revenue and app profile management. Now, the choice of which devices an app is available on becomes a simple checkbox decision rather than a technical limitation. Ultimately, this means more apps available to consumers, which is always good for the app ecosystem. Is all of this just rumor, speculation and conjecture? 
    Of course, but it’s not unfounded. As I mentioned earlier, some of the prominent Microsoft watchers are also reporting similar rumors. However, Microsoft itself has even hinted at this future with their recent organizational changes and by telling developers “if you want to develop for Xbox One, start developing for Windows 8 now.” I think this pretty clearly paints the following picture:
    - Microsoft is committed to the “modern” user interface paradigm.
    - Microsoft is changing their release cadence (for all products, not just operating systems) to be faster and more modular.
    - Microsoft is going to continue to unify their OS platforms, both from a consumer perspective and a developer perspective.
    While this direction will certainly concern some people, it will excite many others. Microsoft’s biggest failing has always been its inability to follow through with a strong and sustained marketing strategy that presents a consistent viewpoint and highlights what this unified and connected experience looks like and how it benefits consumers and enterprises. We’ve started to see some of this over the last few years, but it needs to continue and become more aggressive and consistent. In the long run, I think Microsoft will be able to pull all of these technologies and devices together into one seamless ecosystem. It isn’t going to happen overnight, but my prediction is that we will be there by the end of 2016. As both a consumer and a developer, I, for one, am excited about the future of Microsoft.

    Read the article
