Search Results

Search found 106 results on 5 pages for 'citrus rain'.

Page 4/5 | < Previous Page | 1 2 3 4 5  | Next Page >

  • My First 5K, the recap.

    - by Chris Williams
    It was a nice day to be outside (and trust me, those aren't words you'll hear from me often). I got to the site around 7:45, hit the pre-reg table and got my number along with a goody bag full of coupons for racing gear, a water bottle, and a t-shirt. Oh, and a map. Stashed all that stuff in the Jeep, emptied my pockets of everything but my iPhone and my Jeep key, and proceeded to walk around for a bit as people started showing up and signing in. It was fairly breezy, and there was definitely a storm coming... but it was anyone's guess when it would actually arrive. It was interesting to see everyone who was participating. If I had to guess, I would say the event was 60-70% women, with a pretty broad distribution of age... as young as 13 to well over 60 (in both genders). I don't know exactly how many folks were there, but it was well over 300. Eventually it was time to kick things off, and everyone made their way to the start line. All of the 5K and 10K runners were mixed together, starting at the same time. All the walkers and the people with strollers or dogs were in the back. It was pretty chaotic at first, once things started, but it thinned out fairly quickly. The 10K people and the hardcore runners sped ahead of everyone else and the walkers gradually lagged behind. The 5K course was pretty nice, winding around a lake down in Eden Prairie. The 10K course overlapped most of ours, but branched off a couple of times too. I didn't run the whole time, but I started the race running and I ended it running, and did a mix of walking and running along the way. I met my goals, which were a) don't ever stop and b) don't be last. The weather managed to hold out for the entire race. It never got too hot, there was a nice breeze and it was mostly overcast. Pretty much perfect in my book. About 20-30 minutes after I left, the rain came down pretty hard. I had a good time, and will most likely do more of them. We'll see.

    Read the article

  • Oracle Open World - the technologist's fall classic

    - by user581320
    Well, it's September and the calendar is charging towards fall. I realized today that another summer had passed as the first rain in months fell here in northern California redwood country. If it's fall, that means Oracle Open World is right around the corner and that tens of thousands of technology professionals will soon converge on San Francisco to fill every hotel and every corner of their minds. As in years past, team UPK will be there in force to answer questions, demo and discuss all things UPK. On Thursday, October 4, from 12:45pm to 1:45pm, I, along with several of my UPK teammates, will be manning the User Productivity Kit Panel - Best Practices to Manage and Deploy Content. This is an interactive session intended as an opportunity to get your UPK questions answered. To get things started we will answer some questions submitted in advance. If you have any questions or subject areas you’d like addressed during the session, let me know here in the blog and then come to the session and we’ll do our best to answer your questions. Peter Maravelias, UPK Product Management

    Read the article

  • Is there any way to abstract IP address during ssh?

    - by Vivek V K
    I have a server which is in the middle of a forest. It is connected to the Internet via a microwave link and an ADSL link, hence it has two different static IP addresses. If there is heavy rain, the microwave link breaks and I have to use the much slower ADSL link. I ping the microwave IP from time to time to check if it is up again, but at times I end up using the very slow ADSL link even when the microwave link is back up. Hence I need a way to automate this in the following way. 1. I need to abstract the IP address of the machine behind some other name which, when I use ssh or sftp, will poll both IPs and connect me to the best one. For example, if I say ssh -Y name@server, it should first try to connect over the microwave link and, if it can't, then connect over ADSL. 2. Suppose the first time I connect the microwave link is down, so it connects over ADSL; I need it to dynamically switch to the microwave link once it is working again. Is this even possible?
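
    One way to attack the first requirement is a small wrapper that probes both addresses and hands the first reachable one to ssh. The following is only a rough sketch, not a tested answer: it assumes Python 3 on the client, SSH listening on port 22 over both links, and placeholder IP addresses that would need to be replaced with the real ones.

      #!/usr/bin/env python3
      # pick_link.py - connect over whichever link answers, preferring the microwave link.
      # The addresses below are placeholders for the server's two static IPs.
      import socket
      import subprocess
      import sys

      MICROWAVE_IP = "198.51.100.10"   # fast link, drops out in heavy rain
      ADSL_IP = "203.0.113.20"         # slow but reliable fallback
      SSH_PORT = 22

      def reachable(ip, timeout=3):
          """Return True if a TCP connection to ip:22 succeeds within the timeout."""
          try:
              with socket.create_connection((ip, SSH_PORT), timeout=timeout):
                  return True
          except OSError:
              return False

      if __name__ == "__main__":
          target = MICROWAVE_IP if reachable(MICROWAVE_IP) else ADSL_IP
          # Any extra arguments (e.g. -Y, a remote command) are passed straight to ssh.
          sys.exit(subprocess.call(["ssh", "user@" + target] + sys.argv[1:]))

    The second requirement is harder: a wrapper like this only chooses a link at connect time, so an established ADSL session would have to be re-established (or a VPN/multipath setup used) to move back to the microwave link once it recovers.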

    Read the article

  • windows 95 virtualization and CPU idle

    - by brtlvrs
    We are in a situation where we need to migrate our Windows 95 systems (I know, that is so last century). There is no hardware available for a reasonable price to run our Windows 95 systems, so we are extending their overdue lifetime by making them virtual. Now we see that VMware reports 100% CPU utilization for a Windows 95 VM. This is because Windows 95 doesn't know how to idle a CPU. For this reason, software like Rain, Waterfall, or CPUCool was introduced: these tools send a HLT instruction to the CPU, which causes it to halt and wait for the next interrupt before doing more work. I've tested the mentioned programs in the VM, but they don't work and just generate errors. Does anyone have a valid workaround or solution? BTW, I know that the best solution is to replace Windows 95 with Windows XP, but in our situation that will take at least 5 years. Our Windows 95 systems are running factory process control software.....

    Read the article

  • F1 Pit Pragmatics

    - by mikef
    "I hate computers. No, really, I hate them. I love the communications they facilitate, I love the conveniences they provide to my life. but I actually hate the computers themselves." - Scott Merrill, 'I hate computers: confessions of a Sysadmin' If Scott's goal was to polarize opinion and trigger raging arguments over the 'real reasons why computers suck', then he certainly succeeded. Impassioned vitriol sits side-by-side with rational debate. Yet Scott's fundamental point is absolutely on the money - Computers are a means to an end. The IT industry is finally starting to put weight behind the notion that good User Experience is an absolutely crucial goal, a cause championed by the likes of Microsoft's Bill Buxton, and which Apple's increasingly ubiquitous touch screen interface exemplifies. However, that doesn't change the fact that, occasionally, you just have to man up and deal with complex systems. In fact, sometimes you just need to sacrifice everything else in the name of performance. You'll find a perfect example of this Faustian bargain in Trevor Clarke's fascinating look into the (diabolical) IT infrastructure of modern F1 racing - high performance, high availability. high everything. To paraphrase, each car has up to 100 sensors, transmitting around 30Gb of data over the course of a race (70% in real-time). This data is then processed by no less than 3 servers (per car) so that the engineers in the pit have access to telemetry, strategy information, timing feeds, a connection back to the operations room in the team's home base - the list goes on. All of this while the servers are exposed "to carbon dust, oil, vibration, rain, heat, [and] variable power". Now, this is admittedly an extreme context where there's no real choice but to use complex systems where ease-of-use is, at best, a secondary concern. The flip-side is seen in small-scale personal computing such as that seen in Apple's iDevices, which are incredibly intuitive but limited in their scope. In terms of what kinds of systems they prefer to use, I suspect that most SysAdmins find themselves somewhere along this axis of Power vs. Usability, and which end of this axis you resonate with also hints at where you think the IT industry should focus its energy. Do you see yourself in the F1 pit, making split-second decisions, wrestling with information flows and reticent hardware to bend them to your will? If so, I imagine you feel that computers are subtle tools which need to be tuned and honed, using the advanced knowledge possessed only by responsible SysAdmins (If you have an iPhone, I suspect it's jail-broken). If the machines throw enigmatic errors, it's the price of flexibility and raw power. Alternatively, would you prefer to have your role more accessible, with users empowered by knowledge, spreading the load of managing IT environments? In that case, then you want hardware and software to have User Experience as their primary focus, and are of the "means to an end" school of thought (you're probably also fed up with users not listening to you when you try and help). At its heart, the dichotomy is between raw power (which might be difficult to use) and ease-of-use (which might have some limitations, but you can be up and running immediately). Of course, the ultimate goal is a fusion of flexibility, power and usability all in one system. It's achievable in specific software environments, and Red Gate considers it a target worth aiming for, but in other cases it's a goal right up there with cold fusion. 
I think it'll be a long time before we see it become ubiquitous. In the meantime, are you Power-Hungry or a Champion of Usability? Cheers, Michael Francis Simple Talk SysAdmin Editor

    Read the article

  • Big Data – Basics of Big Data Analytics – Day 18 of 21

    - by Pinal Dave
    In yesterday’s blog post we learned the importance of the various components in the Big Data story. In this article we will look at the various analytics tasks we try to achieve with Big Data and list the important tools in the Big Data story. When you have plenty of data around you, what is the first thing that comes to your mind? “What does all this data mean?” Exactly – the same thought comes to my mind as well. I always wanted to know what all the data means and what meaningful information I can extract from it. Most Big Data projects are built to retrieve the various intelligence all this data contains within it. Let us take the example of Facebook. When I look at my friends list on Facebook, I always want to ask many questions, such as: On which date do most of my friends have a birthday? What is the favorite film of most of my friends, so I can talk about it and engage them? What is the most liked place for my friends to travel? Who is the most disliked cousin for my friends in India and the USA, so that when they travel, I do not take them there? There are many more questions I can think of. This illustrates how important it is to analyze Big Data. Here are a few of the kinds of analysis you can use with Big Data. Slicing and Dicing: This means breaking your data down into smaller sets and understanding them one set at a time. This also helps present information in a variety of user-digestible ways. For example, if you have data related to movies, you can slice and dice the data along various dimensions like actors, movie length, etc. Real Time Monitoring: This is very crucial in social media when events are happening and you want to measure the impact while the event is happening. For example, if you are using Twitter during a football match, you can watch what fans are saying about the match on Twitter as it happens. Anomaly Prediction and Modeling: If the business is running normally, all is well, but if there are signs of trouble, everyone wants to know about them early. Big Data analysis of various patterns can be very helpful in predicting the future. It may not always be accurate, but certain hints and signals can be very helpful. For example, lots of data can help conclude that lots of rain can increase the sale of umbrellas. Text and Unstructured Data Analysis: Unstructured data is now becoming the norm in the new world and is a big part of the Big Data revolution. It is very important that we Extract, Transform and Load the unstructured data and make meaningful data out of it. For example, from analysis of lots of images, one can predict that people like to wear certain colors in certain months. Big Data Analytics Solutions: There are many different Big Data analytics solutions on the market. It is impossible to list all of them, so I will list a few here. Tableau – This has to be one of the most popular visualization tools in the big data market. SAS – A high-performance analytics and infrastructure company. IBM and Oracle – They have a range of tools for Big Data analysis. Tomorrow: In tomorrow’s blog post we will discuss a very important member of the Big Data ecosystem – the Data Scientist. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: Big Data, PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL
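
    To make the slicing-and-dicing idea concrete, here is a minimal sketch (not from the original post) using Python with pandas; the tiny movie dataset and its column names are invented purely for illustration.

      import pandas as pd

      # A made-up movie dataset; a real Big Data job would read this from a distributed store.
      movies = pd.DataFrame({
          "title":  ["Rain Dogs", "Citrus Sky", "Monsoon", "Dry Season"],
          "actor":  ["A", "B", "A", "C"],
          "length": [96, 121, 88, 104],   # minutes
          "rating": [7.8, 6.5, 8.1, 7.0],
      })

      # "Slice": keep one smaller, focused subset of the data.
      long_films = movies[movies["length"] > 100]

      # "Dice": regroup the same data along a different dimension.
      rating_by_actor = movies.groupby("actor")["rating"].mean()

      print(long_films)
      print(rating_by_actor)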

    Read the article

  • Oracle celebrates a successful Oracle CloudWorld in Bogotá

    - by yaldahhakim
    Written by Diana Tamayo Tovar. Oracle CloudWorld Bogotá began with scattered showers, rain and strong winds, inviting Colombians to spend a whole day in the social, mobile and complete world of Oracle Cloud. The event took place on November 6th with 807 attendees, 15 media representatives and 65 partners, who gathered to share the business value of Cloud along with Oracle executives and Colombian market leaders. Line-of-business leaders in sales and marketing, customer service and support, HR and talent management, and finance and operations shared their ideas with Colombian customers, giving them a chance to learn, discover and engage with the tools, trends and concepts of Cloud. The highlights of the event included the presence of keynote speakers such as Bob Evans, Chief Communications Officer, and a customer testimonial session with top business leaders from the Colombian insurance, finance, retail, communications and health industries, who shared their innovation experiences and success stories on workforce empowerment, talent management, cloud security, social engagement and productivity, providing best-case scenarios of how Oracle has helped them transform their business with technologies like cloud, social collaboration and mobile applications. The keynote session was preceded by a customer success story from one of the largest virtual network operators in the country, providing an interesting case study of mobile banking innovation and a great customer testimonial of the importance of cross-industry strategies and cloud technology. The event provided five different tracks on the main trends in how companies communicate and engage with different audiences, providing a different perspective on the importance of empowering brands through their customers and trusting and investing in technology for growth, while Oracle University shared their knowledge with “Oracle Cloud Fundamentals”, a training lesson covering Java Cloud, Database Cloud and other Oracle Cloud product technologies and solutions. The rainy day scenario included side shows of aerial acrobatics and speed painting performances to recreate the environment of modern and flexible Cloud solutions in a colorful and creative way. Oracle CloudWorld Bogotá was a great opportunity to dispel invalid cloud myths and address the main concerns of Colombian customers towards cloud, considering IDC Latin America studies stating that 93% of Colombian business leaders are interested in cloud but only 47% understand its business value. Spending a day in the cloud with 6 demogrounds stations, conference sessions, interesting case studies and customer testimonials will surely widen the endless market opportunities for Colombian customers, leaving them amazed with how Oracle Cloud works towards integration with other environments, non-Oracle applications, social media and mobile devices with bulletproof security infrastructure.

    Read the article

  • Type of computer for a developer on the road

    - by nabucosound
    Hi developers: I am planning to travel through Eurasia and Asia (Russia, China, Korea, Japan, South East Asia...) for a while and, although there are plenty of marvelous things to see and to do, I must keep on working :(. I am a python developer, dedicated mainly to web projects. I use django, sqlite3, browsers, and occasionally (only if I have no choice) I install postgres, mysql, apache or any other servers commonly used in the internets. I do my coding in vim, use ssh to connect, lftp to transfer files, IRC, grep/ack... So I spend most of my time in terminal shells. But I also use IM or Skype to communicate with my clients and peers, as well as some other software (that after all is not mandatory for my day-to-day work). I currently work with a MacBook Pro (3 years old now) and so far I am very happy with the performance. But I don't want to carry it if I am going to be "on transit" for a long time; it is simply huge and heavy for what I am planning to load in my rather small backpack (while traveling, less is more, you know). So here I am reading all kinds of opinions about netbooks, because at first sight this is the kind of computer I thought I had to choose. I am going to use Linux on it; Microsoft is not my cup of tea and Mac is not available for them, unless I were to buy a MacBook Air, something that I won't do because if I am robbed or rain/dust/truck loaders break it I would burst into tears. I am concerned about wifi performance and connectivity, as I am going to use one of those linux distros/tools to hack/test on "open" networks (if you know what I mean) in case I am not in a place with real free wifi access and I find myself in an emergency. CPU speed should be acceptable, but since I don't plan to run Photoshop or expensive IDEs, I guess most of the time I won't be overloading the machine. Apart from this, maybe (surely) I am missing other features to consider. With that said (sorry about the length), here comes my question, raised from a deep ignorance regarding the war between netbooks and notebooks (I assume tablet PCs are not for programming yet): If I buy a netbook will I have to throw it away after 1 month on the road and buy a notebook? Or will I be OK? Thanks! Hector Update I have received great feedback so far! I would like to insist on the fact that I will be traveling through many different countries and scenarios. I am sure that while in Japan I will be more than fine with anything related to technology, connectivity, etc. But consider that I will be, for example, on a train through Russia (Trans-Siberian) and will cross Mongolia as well. I will stay in friends' places sometimes, but most of the time I will have to work from hostel rooms, trains, buses, beaches (hey, this last one doesn't sound too bad hehe!). I think some of your answers, guys, seem to focus on the geek part but lose the point of this "on the road" fact. I am very aware and agree that netbooks suck compared to notebooks, but what I am trying to do here is to find a balance and learn from your experiences with netbooks to see firsthand whether a netbook will be a fail in the mid-to-long term of the trip for my purposes.
    So I have summarized the main concepts expressed here in this small list, in no particular order: keyboard/touchpad feel: I use vim, so I don't need to move the mouse pointer much unless I am browsing the web, but I make intensive use of the keyboard. screen real estate: again, terminal work most of the time. battery life: something very important, I think. weight/size: also very important. looks not worth stealing it / don't give a shit if it is lost/stolen/broken: this may depend on the kind of person you are, your economy, etc. Also, to prevent losing work, I will upload EVERYTHING to the cloud whenever I have a chance. wifi: I don't want to discover my wifi is one of those that cannot deal with half the routers on this planet or has poor connectivity. Thanks again for your answers and comments!

    Read the article

  • Maven Multi-Module builds not honoring failsafe-maven-plugin?

    - by Mike Cornell
    I recently discovered that Hudson was not the problem. In actuality it was Maven itself, as the multi-module build was causing the build failure, not Hudson. I just hadn't noticed where the issue actually existed. Leaving the original question here. I'm using the failsafe-maven-plugin to run some integration tests. The difference between failsafe and surefire is that failsafe allows failures and does not fail the build. On my nightly builds there are occasions when a service the integration tests use might be down. In normal builds, the failsafe plugin would let the build continue since the integration tests are allowed to fail. However, Hudson does not seem to respect this and stops the build and produces rain. I tried to turn the failsafe tests off on nightly builds using -DskipITs. This appears to fail since I'm in a multi-module build. Any ideas on how to get Maven to respect that these tests can fail even though they're part of a specific module? The project structure is as follows:
    -parent
     \-jar
     \-jar (where integration tests run)
     \-war
     \-ear

    Read the article

  • Moral fits the story or suggest me a nice moral?

    - by Gobi
    A 25-year-old son was sitting beside his old father in a train one day. When the train was about to leave, all the passengers started settling down in their seats. The son was filled with joy and anxiety. He was seated by the window. He put his hand out, felt the breeze and screamed, “Papa, look at all the trees, they are moving behind us!” The old father smiled and admired his son’s feelings. Beside the old man, a couple was also travelling and observed this strange behavior. They found something awkward and childish in the behavior of this 25-year-old man. All of a sudden, the son shouted again, “Papa, see! The clouds are moving about; there is a pond down there and many cows are drinking its water.” It soon started drizzling. Once again, the young man felt excited and said, “Papa, I can see and feel the raindrops touching my hand.” The couple, seeing this and feeling concerned, asked the old man, “Why don’t you consult a good doctor and treat your son; don’t you find something abnormally different in him?” The old man replied, “Yes, I have provided the best treatment for my only boy. We are just returning from the hospital. I am happy, for today is the day he has received his sense of sight. It’s the first time my son is seeing and relishing these little wonders which we have been watching and ignoring in our routine life!” The couple had no words to reply and felt sorry for their remarks. Moral of the story: “Don’t judge a book by its cover.” Does this moral fit the story, or can you suggest a better one? :)

    Read the article

  • Can't add/remove items from a collection while foreach is iterating over it

    - by flockofcode
    If I make my own implementation of the IEnumerator interface, then I am able (inside the foreach statement) to add or remove items from albumsList without generating an exception. But if the foreach statement uses the IEnumerator supplied by albumsList, then trying to add/delete items from albumsList (inside the foreach) will result in an exception:

    class Program
    {
        static void Main(string[] args)
        {
            string[] rockAlbums = { "rock", "roll", "rain dogs" };
            ArrayList albumsList = new ArrayList(rockAlbums);
            AlbumsCollection ac = new AlbumsCollection(albumsList);

            foreach (string item in ac)
            {
                Console.WriteLine(item);
                albumsList.Remove(item); // works
            }

            foreach (string item in albumsList)
            {
                albumsList.Remove(item); // exception
            }
        }

        class MyEnumerator : IEnumerator
        {
            ArrayList table;
            int _current = -1;

            public Object Current
            {
                get { return table[_current]; }
            }

            public bool MoveNext()
            {
                if (_current + 1 < table.Count)
                {
                    _current++;
                    return true;
                }
                else
                    return false;
            }

            public void Reset()
            {
                _current = -1;
            }

            public MyEnumerator(ArrayList albums)
            {
                this.table = albums;
            }
        }

        class AlbumsCollection : IEnumerable
        {
            public ArrayList albums;

            public IEnumerator GetEnumerator()
            {
                return new MyEnumerator(this.albums);
            }

            public AlbumsCollection(ArrayList albums)
            {
                this.albums = albums;
            }
        }
    }

    a) I assume the code that throws the exception (when using the IEnumerator implementation A supplied by albumsList) is located inside A? b) If I want to be able to add/remove items from a collection (while foreach is iterating over it), will I always need to provide my own implementation of the IEnumerator interface, or can albumsList be set to allow adding/removing items? thank you

    Read the article

  • Going to the Score Cards - Exceptional DBA Awards 2011

    - by Rodney
    This year marks my 4th year as a judge for the Exceptional DBA Awards, founded by Red Gate in 2008 to "recognize the essential but often overlooked contributions of DBAs, the unsung heroes of the IT community." As a professional DBA myself I have been honored to participate as a judge. It is not an easy job because there is a voluminous amount of nominees from all over the world. Each judge has to read through every word of the nominee's answers, deciding what makes each person special and stand out amongst their peers. What drives them? What single element of their submission will shine above all others? It is my hope that what I am about to divulge to you as a judge will prompt you to think about yourself or someone you know and decide that you may be the exceptional DBA who can take home the gold at this year's award ceremony in Seattle. We are more than a few weeks into the nomination process and there are quite a number of submissions already. I can not tell you how many as that would not be fair. I can say it is not 1 million or more. I can also say that it is not 100,000. But that is all I can say about that. However, I can tell you that it is enough this year that we are breaking records on the number of people who have been influenced, inspired or intrigued by the awards in the past. I remember them all like it were yesterday. fuzzy thought cloud here. It was a rainy day in Seattle (all memories for each award ceremony will start thusly) and I was in the hotel going over my notes on what I wanted to say about the winner of the 2008 Red Gate Exceptional DBA Award. The notes were on index cards that I had either bought or stolen from my wife, I do not recall, but I was nervous which was unlike me. This was, after all, a big night for the winner. Of course, we, the judges and the SQL community, had already decided the winner and now all that remained was to present the award. The room was packed. It was Casino night, sponsored by sqlservercentral.com. Money (fake), drinks (not fake) and camaraderie flowed through the room. Dan McClain won the award that year. He worked for Anheuser-Busch at the time. I promise that did not influence my decision. We presented Dan with the award. He was very proud of this achievement, rightfully so, as was the SQL community for him. I spoke with Dan throughout the conference and realized how huge this award was for him, not just personally but professionally. It was a rainy day in Seattle in 2009 and I was nervous. I was asked to speak to a group of people again as a judge for the Exceptional DBA Awards. This year, Josef Richberg would be the recipient of the award, but he would not be able to attend. We all prayed for him as he fought through an illness and congratulated him for his accomplishments as a DBA for his company. He got better and sallied forth and continued to give back to the SQL community that he saw as one big family. In 2010, and I am getting ahead of myself, he was asked to be a judge himself for the very award he had just received the year before. It was a sunny day in Seattle and I missed it, because it was in July and I was not there. It was a rainy day in Seattle and it is 2010 and Tracy Hamlin enters a submission that blows this judge away. She is managing a 50 Terabyte distributed database ("50 Gigabytes! Are you kidding me!!!", Rodney jokes.)  and loves her daily job as a DBA working with developers, mentoring them and teaching them best practices with kindness and patience. 
    She is a people person who just happens to have 10+ years of experience with RDBMSs. She wins the award and goes on to be recognized as famous at PASS. It will be a rainy day in Seattle this year when I sit amongst my old constituent judges and friends, Brad McGehee, http://www.simple-talk.com/books/sql-books/how-to-become-an-exceptional-dba,-2nd-edition/, Steve Jones, whom we all know and love at http://www.sqlservercentral.com, and a young upstart to the SQL Community, this cat named Brent Ozar, to announce the 2011 winner. I personally have not heard of Brent but I am told I interviewed him for a DBA position several years ago and turned him down, http://www.brentozar.com/archive/2011/05/exceptional-dba-contest/ . I hope that did not jeopardize his future in the SQL world. I am a big-hearted oaf and would feel horrible. Hopefully I will meet him at PASS and we can work this all out and I can help him get a DBA job. The rain has stopped and a new year is upon us. The stakes are high...the competition is fierce...the rewards are incredible. The entry form awaits you. http://www.exceptionaldba.com/ I very much look forward to meeting you and presenting the award to you in front of hundreds of your envious but proud peers as the new Exceptional DBA for 2011 at the PASS Summit. Here is what you could win: The Exceptional DBA of the Year receives full conference registration for the 2011 PASS Summit in Seattle, where the awards ceremony will take place, four nights' hotel accommodation, and $300 towards travel expenses. They will also be featured on Simple-Talk. Are you ready? Are you nervous?

    Read the article

  • Spotlight on Claims: Serving Customers Under Extreme Conditions

    - by [email protected]
    Oracle Insurance's director of marketing for EMEA, John Sinclair, recently attended the CII Spotlight on Claims event in London. Bad weather and its implications for the insurance industry have become very topical as the frequency and diversity of natural disasters - including rains, wind and snow - have surged across Europe this winter. On England's wettest day on record, the county of Cumbria was flooded with 12 inches of rain within 24 hours. Freezing temperatures wreaked havoc on European travel, causing high-speed TGV trains to break down and stranding hundreds of passengers in a tunnel under the English Channel all night long without heat or electricity. A storm named Xynthia thrashed France and surrounding countries with hurricane force, flooding ports and killing 51 people. After the Spring Equinox, insurers may have thought the worst had passed. Then along came Eyjafjallajökull, spewing out vast quantities of volcanic ash in what is turning out to be one of the most costly natural disasters in history. Such extreme events challenge insurance companies' ability to service their customers just when customers need their help most. When you add economic downturn and competitive pressures to the mix, insurers are further stretched and required to continually learn and innovate to meet high customer expectations with reduced budgets. These and other issues were hot topics of discussion at the recent "Spotlight on Claims" seminar in London, focused on how weather is affecting claims and the insurance industry. The event was organized by the CII (Chartered Insurance Institute), a group with 90,000 members. CII has been at the forefront of setting professional standards for the insurance industry for over a century. Insurers came to the conference to hear how they could better serve their customers under extreme weather conditions, learn from the experience of their peers, and hear about technological breakthroughs in climate modeling, geographic intelligence and IT. Customer case studies at the conference highlighted the importance of effective and constant communication in handling the overflow of catastrophe-related claims. First and foremost is the need to rapidly establish initial communication with claimants to build their confidence in a positive outcome. Ongoing communication then needs to continue throughout the claims cycle to manage expectations and maintain ownership of the process from start to finish. Strong internal communication to support frontline staff was also deemed critical to successful crisis management, as was communication with the broader insurance ecosystem to tap into extended resources and business intelligence. Advances in technology - such as web-based systems to access policies and enter first notice of loss in the field - as well as customer-focused self-service portals and multichannel alerts, are instrumental in improving customer satisfaction and helping insurers to deal with the claims surge, which can often reach four or more times normal workloads. Dynamic models of the global climate system can now be used to better understand weather-related risks, and as these models mature it is hoped that they will soon become more accurate in predicting the timing of catastrophic events. Geographic intelligence is also being used within a claims environment to better assess loss reserves and detect fraud.
Despite these advances in dealing with catastrophes and predicting their occurrence, there will never be a substitute for qualified front line staff to deal with customers. In light of pressures to streamline efficiency, there was debate as to whether outsourcing was the solution, or whether it was better to build on the people you have. In the final analysis, nearly everybody agreed that in the future insurance companies would have to work better and smarter to keep on top. An appeal was also made for greater collaboration amongst industry participants in dealing with the extreme conditions and systematic stress brought on by natural disasters. It was pointed out that the public oftentimes judged the industry as a whole rather than the individual carriers when it comes to freakish events, and that all would benefit at such times from the pooling of limited resources and professional skills rather than competing in silos for competitive advantage - especially the end customer. One case study that stood out was on how The Motorists Insurance Group was able to power through one of the most devastating catastrophes in recent years - Hurricane Ike. The keys to Motorists' success were superior people, processes and technology. They did a lot of upfront planning and invested in their people, creating a healthy team environment that delivered "max service" even when they were experiencing the same level of devastation as the rest of the population. Processes were rapidly adapted to meet the challenge of the catastrophe and continually adapted to Ike's specific conditions as they evolved. Technology was fundamental to the execution of their strategy, enabling them anywhere access, on the fly reassigning of resources and rapid training to augment the work force. You can learn more about the Motorists experience by watching this video. John Sinclair is marketing director for Oracle Insurance in EMEA. He has more than 20 years of experience in insurance and financial services.

    Read the article

  • HDFC Bank's Journey to Oracle Private Database Cloud

    - by Nilesh Agrawal
    One of the key takeaways from a recent post by Sushil Kumar is the importance of business initiative that drives the transformational journey from legacy IT to enterprise private cloud. The journey that leads to a agile, self-service and efficient infrastructure with reduced complexity and enables IT to deliver services more closely aligned with business requirements. Nilanjay Bhattacharjee, AVP, IT of HDFC Bank presented a real-world case study based on one such initiative in his Oracle OpenWorld session titled "HDFC BANK Journey into Oracle Database Cloud with EM 12c DBaaS". The case study highlighted in this session is from HDFC Bank’s Lending Business Segment, which comprises roughly 50% of Bank’s top line. Bank’s Lending Business is always under pressure to launch “New Schemes” to compete and stay ahead in this segment and IT has to keep up with this challenging business requirement. Lending related applications are highly dynamic and go through constant changes and every single and minor change in each related application is required to be thoroughly UAT tested certified before they are certified for production rollout. This leads to a constant pressure in IT for rapid provisioning of UAT databases on an ongoing basis to enable faster time to market. Nilanjay joined Sushil Kumar, VP, Product Strategy, Oracle, during the Enterprise Manager general session at Oracle OpenWorld 2012. Let's watch what Nilanjay had to say about their recent Database cloud deployment. “Agility” in launching new business schemes became the key business driver for private database cloud adoption in the Bank. Nilanjay spent an hour discussing it during his session. Let's look at why Database-as-a-Service(DBaaS) model was need of the hour in this case  - Average 3 days to provision UAT Database for Loan Management Application Silo’ed UAT environment with Average 30% utilization Compliance requirement consume UAT testing resources DBA activities leads to $$ paid to SI for provisioning databases manually Overhead in managing configuration drift between production and test environments Rollout impact/delay on new business initiatives The private database cloud implementation progressed through 4 fundamental phases - Standardization, Consolidation, Automation, Optimization of UAT infrastructure. Project scoping was carried out and end users and stakeholders were engaged early on right from planning phase and including all phases of implementation. Standardization and Consolidation phase involved multiple iterations of planning to first standardize on infrastructure, db versions, patch levels, configuration, IT processes etc and with database level consolidation project onto Exadata platform. It was also decided to have existing AIX UAT DB landscape covered and EM 12c DBaaS solution being platform agnostic supported this model well. Automation and Optimization phase provided the necessary Agility, Self-Service and efficiency and this was made possible via EM 12c DBaaS. EM 12c DBaaS Self-Service/SSA Portal was setup with required zones, quotas, service templates, charge plan defined. There were 2 zones implemented - Exadata zone  primarily for UAT and benchmark testing for databases running on Exadata platform and second zone was for AIX setup to cover other databases those running on AIX. Metering and Chargeback/Showback capabilities provided business and IT the framework for cloud optimization and also visibility into cloud usage. 
More details on UAT cloud implementation, related building blocks and EM 12c DBaaS solution are covered in Nilanjay's OpenWorld session here. Some of the key Benefits achieved from UAT cloud initiative are - New business initiatives can be easily launched due to rapid provisioning of UAT Databases [ ~3 hours ] Drastically cut down $$ on SI for DBA Activities due to Self-Service Effective usage of infrastructure leading to  better ROI Empowering  consumers to provision database using Self-Service Control on project schedule with DB end date aligned to project plan submitted during provisioning Databases provisioned through Self-Service are monitored in EM and auto configured for Alerts and KPI Regulatory requirement of database does not impact existing project in queue This table below shows typical list of activities and tasks involved when a end user requests for a UAT database. EM 12c DBaaS solution helped reduce UAT database provisioning time from roughly 3 days down to 3 hours and this timing also includes provisioning time for database with production scale data (ranging from 250 G to 2 TB of data) - And it's not just about time to provision,  this initiative has enabled an agile, efficient and transparent UAT environment where end users are empowered with real control of cloud resources and IT's role is shifted as enabler of strategic services instead of being administrator of all user requests. The strong collaboration between IT and business community right from planning to implementation to go-live has played the key role in achieving this common goal of enterprise private cloud. Finally, real cloud is here and this cloud is accompanied with rain (business benefits) as well ! For more information, please go to Oracle Enterprise Manager  web page or  follow us at :  Twitter | Facebook | YouTube | Linkedin | Newsletter

    Read the article

  • SQL SERVER – SSMS: Database Consistency History Report

    - by Pinal Dave
    Doctor and Database The last place I like to visit is always a hospital. With the monsoon season starting, intermittent rains, it has become sort of a routine to get a cycle of fever every other year (seriously I hate it). So when I visit my doctor, it is always interesting in the way he quizzes me. The routine question of – “How many days have you had this?”, “Is there any pattern?”, “Did you drench in rain?”, “Do you have any other symptom?” and so on. The idea here is that the doctor wants to find any anomaly or a pattern that will guide him to a viral or bacterial type. Most of the time they get it based on experience and sometimes after a battery of tests. So if there is consistent behavior to your problem, there is always a solution out. SQL Server has its way to find if the server data / files are in consistent state using the DBCC commands. Back to SQL Server In real life, Database consistency check is one of the critical operations a DBA generally doesn’t give much priority. Many readers of my blogs have asked many times, how do we know if the database is consistent? How do I read output of DBCC CHECKDB and find if everything is right or not? My common answer to all of them is – look at the bottom of checkdb (or checktable) output and look for below line. CHECKDB found 0 allocation errors and 0 consistency errors in database ‘DatabaseName’. Above is a “good sign” because we are seeing zero allocation and zero consistency error. If you are seeing non-zero errors then there is some problem with the database. Sample output is shown as below: CHECKDB found 0 allocation errors and 2 consistency errors in database ‘DatabaseName’. repair_allow_data_loss is the minimum repair level for the errors found by DBCC CHECKDB (DatabaseName). If we see non-zero error then most of the time (not always) we get repair options depending on the level of corruption. There is risk involved with above option (repair_allow_data_loss), that is – we would lose the data. Sometimes the option would be repair_rebuild which is little safer. Though these options are available, it is important to find the root cause to the problem. In standard report, there is a report which can show the history of checkdb executed for the selected database. Since this is a database level report, we need to right click on database, click Reports, click Standard Reports and then choose “Database Consistency History” report. The information in this report is picked from default trace. If default trace is disabled or there is no checkdb run or information is not there in default trace (because it’s rolled over), we would get report like below. As we can see report says it very clearly: Currently, no execution history of CHECKDB is available or default trace is not enabled. To demonstrate, I have caused corruption in one of the database and did below steps. Run CheckDB so that errors are reported. Fix the corruption by losing the data using repair option Run CheckDB again to check if corruption is cleared. After that I have launched the report and below is what we would see. If you are lazy like me and don’t want to run the report manually for each database then below query would be handy to provide same report for all database. This query is runs behind the scenes by the report. All I have done is remove the filter for database name (at the last – highlighted). 
    DECLARE @curr_tracefilename VARCHAR(500);
    DECLARE @base_tracefilename VARCHAR(500);
    DECLARE @indx INT;

    SELECT @curr_tracefilename = path FROM sys.traces WHERE is_default = 1;
    SET @curr_tracefilename = REVERSE(@curr_tracefilename);
    SELECT @indx = PATINDEX('%\%', @curr_tracefilename);
    SET @curr_tracefilename = REVERSE(@curr_tracefilename);
    SET @base_tracefilename = LEFT(@curr_tracefilename, LEN(@curr_tracefilename) - @indx) + '\log.trc';

    SELECT SUBSTRING(CONVERT(NVARCHAR(MAX), TEXTData), 36, PATINDEX('%executed%', TEXTData) - 36) AS command,
           LoginName,
           StartTime,
           CONVERT(INT, SUBSTRING(CONVERT(NVARCHAR(MAX), TEXTData), PATINDEX('%found%', TEXTData) + 6, PATINDEX('%errors %', TEXTData) - PATINDEX('%found%', TEXTData) - 6)) AS errors,
           CONVERT(INT, SUBSTRING(CONVERT(NVARCHAR(MAX), TEXTData), PATINDEX('%repaired%', TEXTData) + 9, PATINDEX('%errors.%', TEXTData) - PATINDEX('%repaired%', TEXTData) - 9)) AS repaired,
           SUBSTRING(CONVERT(NVARCHAR(MAX), TEXTData), PATINDEX('%time:%', TEXTData) + 6, PATINDEX('%hours%', TEXTData) - PATINDEX('%time:%', TEXTData) - 6) + ':' +
           SUBSTRING(CONVERT(NVARCHAR(MAX), TEXTData), PATINDEX('%hours%', TEXTData) + 6, PATINDEX('%minutes%', TEXTData) - PATINDEX('%hours%', TEXTData) - 6) + ':' +
           SUBSTRING(CONVERT(NVARCHAR(MAX), TEXTData), PATINDEX('%minutes%', TEXTData) + 8, PATINDEX('%seconds.%', TEXTData) - PATINDEX('%minutes%', TEXTData) - 8) AS time
    FROM ::fn_trace_gettable(@base_tracefilename, DEFAULT)
    WHERE EventClass = 22
      AND SUBSTRING(TEXTData, 36, 12) = 'DBCC CHECKDB'
      -- AND DatabaseName = @DatabaseName;
    Don’t get worried about the logic above. All it is doing is reading the trace files, parsing the entry below and pulling out the information for the underlined words. DBCC CHECKDB (CorruptedDatabase) executed by sa found 2 errors and repaired 0 errors. Elapsed time: 0 hours 0 minutes 0 seconds.  Internal database snapshot has split point LSN = 00000029:00000030:0001 and first LSN = 00000029:00000020:0001. Hopefully from now on you will run checkdb and understand its importance. As responsible DBAs I am sure you are already doing it; let me know how often you actually run it on your production environment. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Server Management Studio, SQL Tips and Tricks, T SQL Tagged: SQL Reports

    Read the article

  • BIP and Mapviewer Mash Up I

    - by Tim Dexter
    I was out in Yellowstone last week soaking up various wildlife and a bit too much rain ... good to be back until the 95F heat yesterday. Taking a little break from the Excel templates; the dev folks are planing an Excel patch in the next week or so that will add a mass of new functionality. At the risk of completely mis leading you I'm going to hang back a while. What I have written so far holds true and will continue to do so. This week, I have been mostly eating 'mapviewer' ... answers on a post card please, TV show and character. I had a request to show how BIP can call mapviewer and render a dynamic map in an output. So I hit the books and colleagues for some answers. Mapviewer is Oracle's geographic information system, hereby known as GIS. I use it a lot in our BIEE demos where the interaction with the maps is very impressive. Need a map of California and its congressional districts? I have contacts; Jerry and David with their little black box of maps. Once in my possession I can build highly interactive, clickable maps that allow the user to drill into more information using a very friendly interface driving BIEE content and navigation. But what about maps in BIP output? Bryan Wise, who has written some articles on this blog did some work a while back with the PL/SQL API interface. The extract for the report called a function that in turn called the mapviewer server, passing a set of mapping requirements, it then returned a URL to a cached copy of that map. Easy to then have BIP render that image. Thats still very doable. You need to install a couple of packages and then load the mapviewer java APIs into the database. Then you can write your function to the APIs. A little involved? Maybe, but the database is doing all the heavy lifting for you. I thought I would investigate another method for getting the maps back into BIP. There is a URL interface you can call, this involves building an XML message to be passed to the mapviewer server. It's pretty straightforward to use on the mapviewer side. On the BIP side things are little more tricksy. After some unexpected messing about I finally got the ubiquitous Hello World map to render using the URL method. Not the most exciting map in the world, lots of ocean and a rather long URL to get it to render. http://127.0.0.1:9704/mapviewer/omserver?xml_request=%3Cmap_request%20title=%22Hello%20World%22%20datasource=%22cagis%22%20format=%22GIF_STREAM%22/%3E Notice all of the encoding in the URL string to handle the spaces, quotes, etc. All necessary to get BIP to make the call to the mapviewer server correctly without truncating the URL if it hits a real space rather than a %20. With that in mind constructing the URL was pretty simple. I'm not going to get into the content of the URL too much, for that you need to bone up on the mapviewer XML API. Check out the home page here and the documentation here. To make the template portable I used the standard CURRENT_SERVER_URL parameter from the BIP server and declared that in my template. <?param@begin:CURRENT_SERVER_URL;'myserver'?> Ignore the 'myserver', that was just a dummy value for testing at runtime it will resolve to: 'http://yourserver:port/xmlpserver' Not quite what we need as mapviewer has its own server path, in my case I needed 'mapviewer/omserver?xml_request=' as the fixed path to the mapviewer request URL. 
A little concatenation and substringing later I came up with <?param@begin:mURL;concat(substring($CURRENT_SERVER_URL,1,22),'mapviewer/omserver?xml_request=')?> Thats the basic URL that I can then build on. To get the Hello World map I need to add the following: <map_request title="Hello World" datasource="cagis" format="GIF_STREAM"/> Those angle brackets were the source of my headache, BIPs XSLT engine was attempting to process them rather than just pass them. Hok Min to the rescue ... again. I owe him lunch when I get out to HQ again! To solve the problem, I needed to escape all the characters and white space and then use native XSL to assign the string to a parameter. <xsl:param xdofo:ctx="begin"name="pXML">%3Cmap_request%20title=%22Hello%20World%22 %20datasource=%22cagis%22%20format=%22GIF_STREAM%22/%3E</xsl:param> I did not need to assign it to a parameter but I felt that if I were going to do anything more serious than Hello World like plotting points of interest on the map. I would need to dynamically build the URL, so using a set of parameters or variables that I then concatenated would be easier. Now I had the initial server string and the request all I then did was combine the two using a concat: concat($mURL,$pXML) Embedding that into an image tag: <fo:external-graphic src="url({concat($mURL,$pXML)})"/> and I was done. Notice the curly braces to get the concat evaluated prior to the image call. As you will see next time, building the XML message to go onto the URL can get quite complex but I have used it with some data. Ultimately, it would be easier to build an extension to BIP to handle the data to be plotted, it would then build the XML message, call mapviewer and return a URL to the map image for BIP to render. More on that next time ...
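
    As a side note, the percent-encoding step above is mechanical, so it can be checked outside of BIP. Here is a quick illustration in Python (not part of the BIP template itself; the server address is simply the one from the Hello World example) showing how the map_request XML turns into the encoded URL quoted earlier:

      from urllib.parse import quote

      server_url = "http://127.0.0.1:9704/xmlpserver"        # what CURRENT_SERVER_URL resolves to in this example
      mapviewer_path = "mapviewer/omserver?xml_request="

      # The raw mapviewer request; the spaces, quotes and angle brackets are what break a naive URL.
      xml_request = '<map_request title="Hello World" datasource="cagis" format="GIF_STREAM"/>'

      # Encode everything except '=' and '/', mirroring the %3C/%20/%22 escapes in the URL above.
      encoded = quote(xml_request, safe="=/")

      url = server_url.rsplit("/", 1)[0] + "/" + mapviewer_path + encoded
      print(url)   # matches the long Hello World URL quoted earlier in the post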

    Read the article

  • Waiting for Windows 8: A Long, Hot Summer

    - by andrewbrust
    Microsoft has revealed some things about Windows 8, and revealed a part of the developer story for new Windows 8 “tailored,” “immersive” applications.  In retrospect, very little was shared.  The bit that was revealed to us is that those applications can be developed using a combination of HTML 5 and JavaScript.  Not much else was said, except that additional details would be revealed at Microsoft’s //Build/ conference in Anaheim, California in September. This has left a lot of people in suspense, and it seems that suspended state is going to last all summer.  The problem, of course, is that in the absence of hard information, people fill the void with Speculation, Rumor and Gloom.  That’s a bit like Fear, Uncertainty and Doubt, except that it’s self-imposed by the Microsoft community and not planted by Microsoft’s competitors. This is a less-than-perfect situation.  Not only is it causing developers to worry about the value of their skill sets, but I am already hearing from consulting shops that customers are getting nervous too and, in extreme cases, opting for non-Microsoft tools for their projects as a result.  I’m also hearing from dev tool ISVs that sales have suffered as a result. It’s quite possible that the customers moving off .NET wanted to do so anyway, and it’s also possible that dev tool ISVs are suffering slower sales this year due to a slowed rate of economic recovery. Without hard information, people tend to interpret things negatively.  Actually, that’s the major point in all of this. While there is a multitude of opinions about what the Windows 8 development platform will look like once fully revealed, there is an emerging consensus around one thing: it sure would help if Microsoft revealed more of its strategy…just enough to quash absurd rumors, stabilize the .NET ecosystem and get people to stay calm. We’ve had some reassurances thus far: there will be a Windows desktop mode; we’ll still have Windows Explorer, we’ll still run Office, we’ll still have a task bar, and all the skills and tools we use now will still work there.  But with reassurances like that…people still feel insecure.  Because telling us that Windows 8 will have what is essentially a “classic” mode sure makes it sound like today’s skill sets will soon be “classic” too…and then maybe they’ll just become obsolete. Humans find change scary; it’s natural.  And when left alone with their fears – because no one is saying anything to dispel them – people can go from frightened to paranoid, and can start viewing things in a downright conspiratorial light.  It would be great if Microsoft stepped into the void now and told us what is coming – especially because whatever they tell us is bound to be at least a little better than what people think they are going to hear. I don’t know what the announcements will be, but I do have it on authority, from a number of sources, that Microsoft isn’t going to talk until //Build/.  That means no news until September 13th.  Nothing until after Labor Day.  You get zippo until after the Back-to-School sales are done. What to do?  Try not to let the dark voices of gloom and doom fill your head.  Even in the absence of answers, we still have some important facts: The .NET developer community is huge. Microsoft’s customers have major investments in .NET, and in .NET skills. Political infighting in Redmond might make for irrational decisions, but ultimately public companies can’t just alienate their advocates and piss off their customers.
Spite doesn’t trump fiduciary responsibility. The computing device markets are changing, software is changing, software business models are changing and developers are changing.  Microsoft has to keep up. The HTML + JavaScript community is huge too, and it includes many of the “changed” developers. Public companies can’t ignore new markets nor the popular standards that can help them enter those new markets.  Loyalty doesn’t trump fiduciary responsibility either. If Microsoft can appeal to new developers, then it should. If Microsoft can keep catering to its existing developers and customers -- not just through legacy support, but also through empowering futures -- then it probably will. You don’t have to shove your old friends out into the rain to make room for new ones; you can bring those new constituents in under a bigger tent.  I hope Microsoft will enlarge the tent, and I have trouble imagining why it would not.

    Read the article

  • Delphi - TPerlRegEx / RegExBuddy Problem

    - by Brad
    I've got a problem with RegEx and Delphi 2k9 (Win32). I get the following error: First chance exception at $7C812AFB. Exception class Exception with message 'TPerlRegEx.Compile() - Please specify a regular expression in RegEx first'. I've got the latest version of TPerlRegEx from the website, using its default settings (using the DLL). I'm including demo source code; it's using the code generated by RegExBuddy, latest version.
    http://www.4shared.com/file/236428923/97478b61/googleresultstestdata.html
    http://www.4shared.com/file/236439483/e0acbe6d/Unit2.html Delphi FORM
    http://www.4shared.com/file/236439473/6734a2a2/Unit2.html Delphi PAS
    Thanks for any help -Brad
    The data is from the Google External Keyword Tool. The RegEx could use some refinement... but it works in RegExBuddy, not in Delphi.

    unit Unit2;

    interface

    uses
      Windows, Messages, SysUtils, Variants, Classes, Graphics, Controls, Forms,
      Dialogs, StdCtrls, PerlRegEx;

    type
      TForm2 = class(TForm)
        Memo1: TMemo;
        Memo2: TMemo;
        Button1: TButton;
        procedure Button1Click(Sender: TObject);
      private
        { Private declarations }
      public
        { Public declarations }
      end;

    var
      Form2: TForm2;

    implementation

    {$R *.dfm}

    procedure TForm2.Button1Click(Sender: TObject);
    var
      Regex: TPerlRegEx;
      GroupIndex: Integer;
    begin
      Regex := TPerlRegEx.Create(nil);
      Regex.RegEx := 'criteria\.push\(new kpCriterion\(&#39;(?P<keyword>(.*?))&#39;, (?P<number1>(.*?)),'#13#10'''(?P<localsearch>(.*?))'', ''(?P<globalsearch>(.*?))'', (?P<localsearchnum>(.*?)), (?P<globalsearchnum>(.*?)), (.*+)'#13#10','#13#10'&#39;\$(?P<price>(.*?))&#39;, (?P<number2>(.*?)),'#13#10'&#39;(?P<range>(.*?))&#39;, (?P<number3>(.*+))';
      Regex.Options := [preMultiLine];
      Regex.Subject := memo1.text;
      if Regex.Match then
      begin
        memo2.Lines.Add('Matches Found');
        repeat
          for GroupIndex := 0 to Regex.SubExpressionCount do
          begin
            memo2.lines.add(Regex.SubExpressions[GroupIndex]); // Add results to memo
            // backreference text:   Regex.SubExpressions[GroupIndex];
            // backreference start:  Regex.SubExpressionOffsets[GroupIndex];
            // backreference length: Regex.SubExpressionLengths[GroupIndex];
          end;
        until not Regex.MatchAgain;
      end
      else
        memo2.Lines.Add('No-Matches Found');
    end;

    end.
DFM object Form2: TForm2 Left = 0 Top = 0 Caption = 'Form2' ClientHeight = 247 ClientWidth = 480 Color = clBtnFace Font.Charset = DEFAULT_CHARSET Font.Color = clWindowText Font.Height = -11 Font.Name = 'Tahoma' Font.Style = [] OldCreateOrder = False PixelsPerInch = 96 TextHeight = 13 object Memo1: TMemo Left = 8 Top = 8 Width = 185 Height = 89 Lines.Strings = ( 'var showImpressions = false; var ' 'criteriaSuggestor = ' '&#39;sensei_keyword&#39;; var ' 'historicalTimePeriod = &#39;Mar ' '2009 - Feb 2010&#39;; var ' 'historicalStartMonth = 2; var ' 'impressionTimePeriod = ' '&#39;February&#39;; var ' 'criteriaGroupsArray = new Array(); ' 'var captchaError = false; var ' 'quotaExceeded = false;' 'var criteria = new Array();' 'var monthlyVariation = new ' 'Array();' 'monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.52' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.67' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.82' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '1.0' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.73' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.5' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.45' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.45' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.43' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.4' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.47' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.45' '));' 'criteria.push(new ' 'kpCriterion(&#39;thunderstorm&#3' '9;, 1.9117305278778076,' #39'201,000'#39', '#39'550,000'#39', 201000, ' '550000, 0.8666667' ',' '&#39;$0.49&#39;, 493102,' '&#39;1 - 3&#39;, 2' ',' '0' ',' '0' ',' 'monthlyVariation,' '5' ',' '&#39;&#39;' ',' 'kpView.MATCH_BROAD' ',' '0' ')); var monthlyVariation = new ' 'Array();' 'monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.57' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '1.0' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.7' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.57' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.45' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.42' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.47' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.46' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.43' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.36' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.45' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.43' '));' 'criteria.push(new ' 'kpCriterion(&#39;[thunderstorm]&' '#39;, 1.9117305278778076,' #39'33,100'#39', '#39'90,500'#39', 33100, 90500, ' '0.8666667' ',' '&#39;$0.49&#39;, 493102,' '&#39;1 - 3&#39;, 2' ',' '0' ',' '0' ',' 'monthlyVariation,' '3' ',' '&#39;&#39;' ',' 'kpView.MATCH_EXACT' ',' '0' ')); var monthlyVariation = new ' 'Array();' 'monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.52' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.67' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.82' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '1.0' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.73' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.5' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.45' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.45' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.43' ')); monthlyVariation.push(new 
' 'kpMonthlyPopularity(' '0.4' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.47' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.45' '));' 'criteria.push(new ' 'kpCriterion(&#39;\42thunderstorm\' '042&#39;, 1.9117305278778076,' #39'201,000'#39', '#39'450,000'#39', 201000, ' '450000, 0.8666667' ',' '&#39;$0.49&#39;, 493102,' '&#39;1 - 3&#39;, 2' ',' '0' ',' '0' ',' 'monthlyVariation,' '5' ',' '&#39;&#39;' ',' 'kpView.MATCH_PHRASE' ',' '0' ')); var monthlyVariation = new ' 'Array();' 'monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.75' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.81' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '1.0' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.87' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.64' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.56' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.52' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.6' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.53' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.47' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.58' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.61' '));' 'criteria.push(new ' 'kpCriterion(&#39;thunderstorms&#' '39;, 1.8268921375274658,' #39'110,000'#39', '#39'201,000'#39', 110000, ' '201000, 0.8' ',' '&#39;$0.56&#39;, 559074,' '&#39;1 - 3&#39;, 2' ',' '0' ',' '0' ',' 'monthlyVariation,' '4' ',' '&#39;&#39;' ',' 'kpView.MATCH_BROAD' ',' '0' ')); var monthlyVariation = new ' 'Array();' 'monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.83' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.82' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '1.0' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.67' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.42' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.41' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.47' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.56' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.47' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.39' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.5' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.51' '));' 'criteria.push(new ' 'kpCriterion(&#39;[thunderstorms]&' '#39;, 1.8268921375274658,' #39'22,200'#39', '#39'40,500'#39', 22200, 40500, ' '0.8' ',' '&#39;$0.56&#39;, 559074,' '&#39;1 - 3&#39;, 2' ',' '0' ',' '0' ',' 'monthlyVariation,' '4' ',' '&#39;&#39;' ',' 'kpView.MATCH_EXACT' ',' '0' ')); var monthlyVariation = new ' 'Array();' 'monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.75' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.81' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '1.0' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.87' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.64' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.56' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.52' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.6' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.53' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.47' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.58' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.61' '));' 'criteria.push(new ' 'kpCriterion(&#39;\42thunderstorms' '\042&#39;, 1.8268921375274658,' #39'110,000'#39', 
'#39'165,000'#39', 110000, ' '165000, 0.8' ',' '&#39;$0.56&#39;, 559074,' '&#39;1 - 3&#39;, 2' ',' '0' ',' '0' ',' 'monthlyVariation,' '4' ',' '&#39;&#39;' ',' 'kpView.MATCH_PHRASE' ',' '0' ')); var monthlyVariation = new ' 'Array();' 'monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.71' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.73' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.82' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '1.0' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.87' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.92' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.82' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.7' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.75' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.68' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.77' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.79' '));' 'criteria.push(new ' 'kpCriterion(&#39;lightning ' 'storm&#39;, 1.774579644203186,' #39'49,500'#39', '#39'90,500'#39', 49500, 90500, ' '0.73333335' ',' '&#39;$0.54&#39;, 535666,' '&#39;1 - 3&#39;, 2' ',' '0' ',' '0' ',' 'monthlyVariation,' '5' ',' '&#39;&#39;' ',' 'kpView.MATCH_BROAD' ',' '0' ')); var monthlyVariation = new ' 'Array();' 'monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.76' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.87' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.97' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '1.0' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.87' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '1.0' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.98' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.87' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.84' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.68' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.86' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.81' '));' 'criteria.push(new ' 'kpCriterion(&#39;[lightning ' 'storm]&#39;, 1.774579644203186,' #39'12,100'#39', '#39'22,200'#39', 12100, 22200, ' '0.73333335' ',' '&#39;$0.54&#39;, 535666,' '&#39;1 - 3&#39;, 2' ',' '0' ',' '0' ',' 'monthlyVariation,' '5' ',' '&#39;&#39;' ',' 'kpView.MATCH_EXACT' ',' '0' ')); var monthlyVariation = new ' 'Array();' 'monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.68' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.72' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.81' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '1.0' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.85' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.92' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.81' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.67' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.71' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.65' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.76' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.73' '));' 'criteria.push(new ' 'kpCriterion(&#39;\42lightning ' 'storm\042&#39;, ' '1.774579644203186,' #39'33,100'#39', '#39'60,500'#39', 33100, 60500, ' '0.73333335' ',' '&#39;$0.54&#39;, 535666,' '&#39;1 - 3&#39;, 2' ',' '0' ',' '0' ',' 'monthlyVariation,' '5' ',' '&#39;&#39;' ',' 'kpView.MATCH_PHRASE' ',' '0' ')); var monthlyVariation = new ' 'Array();' 
'monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.69' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.69' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.71' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.66' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.68' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.7' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.75' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.79' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.74' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.72' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '1.0' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.7' '));' 'criteria.push(new ' 'kpCriterion(&#39;rain storm&#39;, ' '1.7464053630828857,' #39'27,100'#39', '#39'49,500'#39', 27100, 49500, ' '0.6666667' ',' '&#39;$0.53&#39;, 526334,' '&#39;1 - 3&#39;, 2' ',' '0' ',' '0' ',' 'monthlyVariation,' '0' ',' '&#39;&#39;' ',' 'kpView.MATCH_BROAD' ',' '0' ')); var monthlyVariation = new ' 'Array();' 'monthlyVariation.push(new ' 'kpMonthlyPopularity(' '1.0' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.87' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.79' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.57' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.55' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.57' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.74' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.76' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.69' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.61' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.89' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.73' '));' 'criteria.push(new ' 'kpCriterion(&#39;[rain ' 'storm]&#39;, ' '1.7464053630828857,' #39'5,400'#39', '#39'8,100'#39', 5400, 8100, ' '0.6666667' ',' '&#39;$0.53&#39;, 526334,' '&#39;1 - 3&#39;, 2' ',' '0' ',' '0' ',' 'monthlyVariation,' '2' ',' '&#39;&#39;' ',' 'kpView.MATCH_EXACT' ',' '0' ')); var monthlyVariation = new ' 'Array();' 'monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.73' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.7' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.68' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.61' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.68' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.69' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.73' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.72' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.62' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.59' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '1.0' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.66' '));' 'criteria.push(new ' 'kpCriterion(&#39;\42rain ' 'storm\042&#39;, ' '1.7464053630828857,' #39'14,800'#39', '#39'27,100'#39', 14800, 27100, ' '0.6666667' ',' '&#39;$0.53&#39;, 526334,' '&#39;1 - 3&#39;, 2' ',' '0' ',' '0' ',' 'monthlyVariation,' '0' ',' '&#39;&#39;' ',' 'kpView.MATCH_PHRASE' ',' '0' ')); var monthlyVariation = new ' 'Array();' 'monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.82' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.87' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '1.0' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '1.0' ')); 
monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.78' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.82' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.84' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.79' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.77' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.61' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.92' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.82' '));' 'criteria.push(new ' 'kpCriterion(&#39;lightning ' 'storms&#39;, ' '1.6842896938323975,' #39'14,800'#39', '#39'27,100'#39', 14800, 27100, ' '0.73333335' ',' '&#39;$0.42&#39;, 417108,' '&#39;1 - 3&#39;, 2' ',' '0' ',' '0' ',' 'monthlyVariation,' '4' ',' '&#39;&#39;' ',' 'kpView.MATCH_BROAD' ',' '0' ')); var monthlyVariation = new ' 'Array();' 'monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.9' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.9' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '1.0' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.84' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.7' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.81' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.88' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.77' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.76' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.57' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.75' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.63' '));' 'criteria.push(new ' 'kpCriterion(&#39;[lightning ' 'storms]&#39;, ' '1.6842896938323975,' #39'3,600'#39', '#39'8,100'#39', 3600, 8100, ' '0.73333335' ',' '&#39;$0.42&#39;, 417108,' '&#39;1 - 3&#39;, 2' ',' '0' ',' '0' ',' 'monthlyVariation,' '4' ',' '&#39;&#39;' ',' 'kpView.MATCH_EXACT' ',' '0' ')); var monthlyVariation = new ' 'Array();' 'monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.8' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.86' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '1.0' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.99' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.77' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.83' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.85' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.78' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.77' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.6' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.91' ')); monthlyVariation.push(new ' 'kpMonthlyPopularity(' '0.81' '));' 'criteria.push(new ' 'kpCriterion(&#39;\42lightning ' 'storms\042&#39;, ' '1.6842896938323975,' #39'12,100'#39', '#39'22,200'#39', 12100, 22200, ' '0.73333335' ',' '&#39;$0.42&#39;, 417108,' '&#39;1 - 3&#39;, 2' ',' '0' ',' '0' ',' 'monthlyVariation,' '4' ',' '&#39;&#39;' ',' 'kpView.MATCH_PHRASE' ',' '0' ')); var monthlyVariation =

    Read the article

  • BeautifulSoup recursive attribute

    - by Marcos Placona
    Hi, trying to parse an XML file with BeautifulSoup, but I hit a brick wall when trying to use the "recursive" attribute with findAll(). I have a pretty odd XML format, shown below:

    <?xml version="1.0"?>
    <catalog>
       <book id="bk101">
          <author>Gambardella, Matthew</author>
          <title>XML Developer's Guide</title>
          <genre>Computer</genre>
          <price>44.95</price>
          <publish_date>2000-10-01</publish_date>
          <description>An in-depth look at creating applications with XML.</description>
          <catalog>true</catalog>
       </book>
       <book id="bk102">
          <author>Ralls, Kim</author>
          <title>Midnight Rain</title>
          <genre>Fantasy</genre>
          <price>5.95</price>
          <publish_date>2000-12-16</publish_date>
          <description>A former architect battles corporate zombies, an evil sorceress, and her own childhood to become queen of the world.</description>
          <catalog>false</catalog>
       </book>
    </catalog>

    As you can see, the catalog tag repeats inside the book tag, which causes an error when I try to do something like:

    from BeautifulSoup import BeautifulStoneSoup as BSS

    catalog = "catalog.xml"

    def open_rss():
        f = open(catalog, 'r')
        return f.read()

    def rss_parser():
        rss_contents = open_rss()
        soup = BSS(rss_contents)
        items = soup.findAll('catalog', recursive=False)
        for item in items:
            print item.title.string

    rss_parser()

    As you will see, on my soup.findAll I've added recursive=False, which in theory would make it not recurse through the item found, but skip to the next one. This doesn't seem to work, as I always get the following error:

    File "catalog.py", line 17, in rss_parser
        print item.title.string
    AttributeError: 'NoneType' object has no attribute 'string'

    I'm sure I'm doing something stupid here, and would appreciate it if someone could give me some help on how to solve this problem. Changing the XML structure is not an option, and this code needs to perform well as it will potentially parse a large XML file.
    Thanks in advance, Marcos

    Read the article

  • Java - Highest, Lowest and Average

    - by Emily
    Hello, I've just started studying and I need help on one of my exercises. I need the end user to input a rainfall number for each month. I then need to output the average rainfall, the highest and lowest months, and the months in which rainfall was above average. I keep getting the same number for the highest and lowest and I have no idea why. I am seriously pulling my hair out. Any help would be greatly appreciated. This is what I have so far:

    public class rainfall {

        /**
         * @param args
         */
        public static void main(String[] args) {
            int[] numgroup;
            numgroup = new int [13];
            ConsoleReader console = new ConsoleReader();
            int highest;
            int lowest;
            int index;
            int tempVal;
            int minMonth;
            int minIndex;
            int maxMonth;
            int maxIndex;
            System.out.println("Welcome to Rainfall");
            for(index = 1; index < 13; index = index + 1)
            {
                System.out.println("Please enter the rainfall for month " + index);
                tempVal = console.readInt();
                while (tempVal>100 || tempVal<0)
                {
                    System.out.println("The rating must be within 0...100. Try again");
                    tempVal = console.readInt();
                }
                numgroup[index] = tempVal;
            }
            lowest = numgroup[0];
            for(minIndex = 0; minIndex < numgroup.length; minIndex = minIndex + 1);
            {
                if (numgroup[0] < lowest)
                {
                    lowest = numgroup[0];
                    minMonth = minIndex;
                }
            }
            highest = numgroup[1];
            for(maxIndex = 0; maxIndex < numgroup.length; maxIndex = maxIndex + 1);
            {
                if (numgroup[1] > highest)
                {
                    highest = numgroup[1];
                    maxMonth = maxIndex;
                }
            }
            System.out.println("The average monthly rainfall was ");
            System.out.println("The lowest monthly rainfall was month " + minIndex);
            System.out.println("The highest monthly rainfall was month " + maxIndex);
            System.out.println("Thank you for using Rainfall");
        }

        private static ConsoleReader ConsoleReader() {
            return null;
        }
    }

    Thanks, Emily
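    The symptom usually traces to the stray semicolons after the second and third for headers (which turn those loops into empty statements) and to comparing numgroup[0] / numgroup[1] instead of the element at the current index. A minimal sketch of the intended min/max/average bookkeeping follows; it is not the poster's program, and it uses java.util.Scanner in place of the course's ConsoleReader helper, so treat the names as illustrative:

    import java.util.Scanner;

    public class RainfallSketch {
        public static void main(String[] args) {
            Scanner console = new Scanner(System.in);
            int[] rainfall = new int[12];              // one entry per month; index 0 is month 1

            for (int i = 0; i < rainfall.length; i++) {
                System.out.println("Please enter the rainfall for month " + (i + 1));
                rainfall[i] = console.nextInt();
            }

            // Track the index of the running extremes and the running total in a single pass.
            int minIndex = 0;
            int maxIndex = 0;
            int total = 0;
            for (int i = 0; i < rainfall.length; i++) {
                total += rainfall[i];
                if (rainfall[i] < rainfall[minIndex]) {
                    minIndex = i;                      // compare against the stored extreme, not element 0
                }
                if (rainfall[i] > rainfall[maxIndex]) {
                    maxIndex = i;
                }
            }
            double average = (double) total / rainfall.length;

            System.out.println("The average monthly rainfall was " + average);
            System.out.println("The lowest monthly rainfall was month " + (minIndex + 1));
            System.out.println("The highest monthly rainfall was month " + (maxIndex + 1));

            // Months above average, which the exercise also asks for:
            for (int i = 0; i < rainfall.length; i++) {
                if (rainfall[i] > average) {
                    System.out.println("Month " + (i + 1) + " was above average");
                }
            }
        }
    }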

    Read the article

  • StaX: Content not allowed in prolog

    - by RalfB
    I have the following (test) XML file below and Java code that uses StaX. I want to apply this code to a file that is about 30 GB in size but with fairly small elements, so I thought StaX would be a good choice. I am getting the following error:

    Exception in thread "main" javax.xml.stream.XMLStreamException: ParseError at [row,col]:[1,1]
    Message: Content is not allowed in prolog
        at com.sun.org.apache.xerces.internal.impl.XMLStreamReaderImpl.next(XMLStreamReaderImpl.java:598)
        at at.tuwien.mucke.util.xml.staxtest.StaXTest.main(StaXTest.java:18)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:601)
        at com.intellij.rt.execution.application.AppMain.main(AppMain.java:120)

    <?xml version='1.0' encoding='utf-8'?>
    <catalog>
        <book id="bk101">
            <author>Gambardella, Matthew</author>
            <title>XML Developer's Guide</title>
            <price>44.95</price>
            <description>An in-depth look at creating applications with XML.</description>
        </book>
        <book id="bk102">
            <author>Ralls, Kim</author>
            <title>Midnight Rain</title>
            <price>5.95</price>
            <description>A former architect battles corporate zombies, an evil sorceress, and her own childhood to become queen of the world.</description>
        </book>
    </catalog>

    Here is the code:

    package xml.staxtest;

    import java.io.*;
    import javax.xml.stream.*;

    public class StaXTest {

        public static void main(String[] args) throws Exception {
            XMLInputFactory xif = XMLInputFactory.newInstance();
            XMLStreamReader streamReader = xif.createXMLStreamReader(new FileReader("D:/Data/testFile.xml"));
            while (streamReader.hasNext()) {
                int eventType = streamReader.next();
                if (eventType == XMLStreamReader.START_ELEMENT) {
                    System.out.println(streamReader.getLocalName());
                }
                //... more to come here later ...
            }
        }
    }
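    "Content is not allowed in prolog" at row 1, column 1 generally means the parser saw characters before the <?xml ...?> declaration – most often a UTF-8 byte-order mark or bytes mangled by a mismatched charset – rather than anything wrong with the streaming loop itself. That may or may not be the poster's exact issue, but a hedged sketch of the same loop is shown below, reading from a raw byte stream so the parser can determine the encoding from the document instead of inheriting the platform default from FileReader (the file path is taken from the question; the class name is illustrative):

    import java.io.FileInputStream;
    import java.io.InputStream;
    import javax.xml.stream.XMLInputFactory;
    import javax.xml.stream.XMLStreamReader;

    public class StaXEncodingSketch {

        public static void main(String[] args) throws Exception {
            XMLInputFactory xif = XMLInputFactory.newInstance();

            // Hand the factory bytes, not chars: createXMLStreamReader(InputStream) lets the
            // parser detect the encoding from the document itself, whereas new FileReader(...)
            // first decodes with the platform default charset and can turn a BOM into junk.
            try (InputStream in = new FileInputStream("D:/Data/testFile.xml")) {
                XMLStreamReader streamReader = xif.createXMLStreamReader(in);
                while (streamReader.hasNext()) {
                    int eventType = streamReader.next();
                    if (eventType == XMLStreamReader.START_ELEMENT) {
                        System.out.println(streamReader.getLocalName());
                    }
                }
                streamReader.close();
            }
        }
    }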

    Read the article

  • Help with code optimization

    - by Ockonal
    Hello, I've written a little particle system for my 2D application. Here is the rain code:

    // HPP -----------------------------------
    struct Data
    {
        float x, y, x_speed, y_speed;
        int timeout;
        Data();
    };

    std::vector<Data> mData;
    bool mFirstTime;

    void processDrops(float windPower, int i);

    // CPP -----------------------------------
    Data::Data()
        : x(rand()%ScreenResolutionX), y(0)
        , x_speed(0), y_speed(0), timeout(rand()%130)
    {
    }

    void Rain::processDrops(float windPower, int i)
    {
        int posX = rand() % mWindowWidth;
        mData[i].x = posX;
        mData[i].x_speed = WindPower*0.1;  // WindPower is float
        mData[i].y_speed = Gravity*0.1;    // Gravity is 9.8 * 19.2

        // If this is the first time, position drops randomly within the window height
        if (mFirstTime)
        {
            mData[i].timeout = 0;
            mData[i].y = rand() % mWindowHeight;
        }
        else
        {
            mData[i].timeout = rand() % 130;
            mData[i].y = 0;
        }
    }

    void update(float windPower, float elapsed)
    {
        // If this is the first time - create the array with new Data structure objects
        if (mFirstTime)
        {
            for (int i=0; i < mMaxObjects; ++i)
            {
                mData.push_back(Data());
                processDrops(windPower, i);
            }
            mFirstTime = false;
        }

        for (int i=0; i < mMaxObjects; i++)
        {
            // Sleep until the timeout reaches 0 (to make drops fall with random delays)
            if (mData[i].timeout > 0)
            {
                mData[i].timeout--;
            }
            else
            {
                // Find new x/y positions
                mData[i].x += mData[i].x_speed * elapsed;
                mData[i].y += mData[i].y_speed * elapsed;

                // Find new speeds
                mData[i].x_speed += windPower * elapsed;
                mData[i].y_speed += Gravity * elapsed;

                // Drawing here ...

                // If the drop has fallen off the bottom of the screen
                if (mData[i].y > mWindowHeight)
                    processDrops(windPower, i);
            }
        }
    }

    So the main idea is: I have a structure which consists of a drop's position and speed. I have a function for processing the drop at a given index in the vector. On the first run I fill the vector to its maximum size and then process it in a loop. But this code runs slower than everything else I have. Please help me optimize it. I tried replacing all the ints with uint16_t but I think it doesn't matter.

    Read the article

  • Hosed Windows 7 permissions

    - by Anthony
    Here is the most interesting thing I've noticed since the problems started: if I go into a control panel/system module (in this case the Resource Monitor) that has a "Check Online" type option, Firefox (my default browser) opens right up without a problem. But if I just start Firefox from any shortcut (start menu, desktop, etc.), the Firefox process starts up (and the start menu icon starts glowing) only to end without notice a few seconds later.
    Possibly related: if I start up in Safe Mode (without Networking; I haven't tried with it yet), I can start up FF or Chrome just fine, but if I attempt to open Chrome normally, I get a permissions error. Opera and Safari seem to be okay (mostly). Safari crashes when I try to download any files.
    All of the above leads me to believe that some (but clearly not all) core files have messed-up permissions. Or rather, that I no longer have permission. System still does, based on Firefox opening without fail when the system initiates it. I've run MS Forefront once in normal mode, and Malwarebytes twice in normal mode and once in safe mode. One trojan was found and deleted, but the problem persists.
    Two other things worth mentioning: I accidentally duplicated my library... I thought I'd try to add the "Internet" folder to my start menu, next to music and downloads. The first advanced thing I tried was "create new library". I clearly misunderstood what this means. I thought it was a way to add virtual folders to the library (which I thought, in turn, would allow me to choose it as a link on the start menu), but instead it recreated my already existing user folder, AppData and all. I didn't notice this until today.
    Then I tried setting permissions for my User folder to full control, recursively... Confused but not giving up, I thought I could maybe create a shortcut to the NetHood folder manually, but instead got hit with an access denied error. So I tried to change the permission levels for all sub-folders of my user folder so that I had full control. I got several access denied errors along the way. At this point I gave up, went out, ended up caught in the rain and stuck on a friend's couch, and showed up late for work the next day. Thanks for nothing, Microsoft.
    When I finally got home today (20 hours later), I noticed that Firefox was acting really strange. I tried opening Chrome to see if the problem was client side or server side, and instead got the above-mentioned "you don't have permission to open this program" alert. And I think that's the whole story. Oh, I also did a system restore, but could only choose a point from this morning (an auto update), and it worked but the problem wasn't fixed. And then all the earlier restore points were gone.
    So the questions are:
    a) Is there a way to set the admin and user privs back to "default"?
    b) Would this, in anyone's expert opinion, fix the problems I'm having?
    c) How come being logged in as an admin isn't the same as being logged in with admin privs? It seems that half the time I have to use "run as admin" for fairly standard things because I'm being treated as me-the-user and not me-the-admin.
    Thanks for reading.

    Read the article

  • I Hereby Resolve… (T-SQL Tuesday #14)

    - by smisner
    It’s time for another T-SQL Tuesday, hosted this month by Jen McCown (blog|twitter), on the topic of resolutions. Specifically, “what techie resolutions have you been pondering, and why?” I like that word – pondering – because I ponder a lot. And while there are many things that I do already because of my job, there are many more things that I ponder about doing…if only I had the time. Then I ponder about making time, but then it’s back to work! In 2010, I was moderately more successful in making time for things that I ponder about than I had been in years past, and I hope to continue that trend in 2011. If Jen hadn’t settled on this topic, I could keep my ponderings to myself and no one would ever know the outcome, but she’s egged me on (and everyone else that chooses to participate)! So here goes…
    For me, having resolve to do something means that I wouldn’t be doing that something as part of my ordinary routine. It takes extra effort to make time for it. It’s not something that I do once and check off a list, but something that I need to commit to over a period of time. So with that in mind, I hereby resolve…
    To Learn Something New… One of the things I love about my job is that I get to do a lot of things outside of my ordinary routine. It’s a veritable smorgasbord of opportunity! So what more could I possibly add to that list of things to do? Well, the more I learn, the more I realize I have so much more to learn. It would be much easier to remain in ignorant bliss, but I was born to learn. Constantly. (And apparently to teach, too – my father will tell you that as a small child, I had the neighborhood kids gathered together to play school – in the summer. I’m sure they loved that – but they did it!) These are some of the things that I want to dedicate some time to learning this year:
    Spatial data. I have a good understanding of how maps in Reporting Services work, and I can cobble together a simple T-SQL spatial query, but I know I’m only scratching the surface here. Rob Farley (blog|twitter) posted interesting examples of combining maps and PivotViewer, and I think there are so many more creative possibilities. I’ve always felt that pictures (including charts and maps) really help people get their minds wrapped around data better, and because a lot of data has a geographic aspect to it, I believe developing some expertise here will be beneficial to my work.
    PivotViewer. Not only is PivotViewer combined with maps a useful way to visualize data, but it’s an interesting way to work with data. If you haven’t seen it yet, check out this interactive demonstration using the Netflix OData feed. According to Rob Farley, learning how to work with PivotViewer isn’t trivial. Just the type of challenge I like!
    Security. You’ve heard of the accidental DBA? Well, I am the accidental security person – is there a word for that role? My eyes used to glaze over when having to study security, or when reading anything about it. Then I had a problem long ago that no one could figure out – not even the vendor’s tech support – until I rolled up my sleeves and painstakingly worked through the myriad of potential problems to resolve a very thorny security issue. I learned a lot in the process, and have been able to share what I’ve learned with a lot of people. But I’m not convinced their eyes weren’t glazing over, too. I don’t take it personally – it’s just a very dry topic!
    So in addition to deepening my understanding of security, I want to find a way to make the subject, as it relates to SQL Server and business intelligence, more accessible and less boring. Well, there’s actually a lot more that I could put on this list, and a lot more things I have plans to do this coming year, but I run the risk of overcommitting myself. And then I wouldn’t have time…
    To Have Fun! My name is Stacia and I’m a workaholic. When I love what I do, it’s difficult to separate out the work time from the fun time. But there are some things that I’ve been meaning to do that aren’t related to business intelligence for which I really need to develop some resolve. And they are techie resolutions, too, in a roundabout sort of way!
    Photography. When my husband and I went on an extended camping trip in 2009 to Yellowstone and the Grand Tetons, I had a nice little digital camera that took decent pictures. But then I saw the gorgeous cameras that other tourists were toting around and decided I needed one too. So I bought a Nikon D90 and have started to learn to use it, but I’m definitely still in the beginning stages. I traveled so much in 2010 and worked on two book projects that I didn’t have a lot of free time to devote to it. I was very inspired by Kimberly Tripp’s (blog|twitter) and Paul Randal’s (blog|twitter) photo-adventure in Alaska, though, and plan to spend some dedicated time with my camera this year. (And hopefully before I move to Alaska – nothing set in stone yet, but we hope to move to a remote location – with Internet access – later this year!)
    Astronomy. I have this cool telescope, but it suffers the same fate as my camera. I have been gone too much and busy with other things, so I haven’t had time to work with it. I’ll figure out how it works, and then so much time passes by that I forget how to use it. I have this crazy idea that I can actually put the camera and the telescope together for astrophotography, but I think I need to start simple by learning how to use each component individually. As long as I’m living in Las Vegas, I know I’ll have clear skies for nighttime viewing, but when we move to Alaska, we’ll be living in a rain forest. I have no idea what my opportunities will be like there – except I know that when the sky is clear, it will be far more amazing than anything I can see in Vegas – even out in the desert – because I’ll be so far away from city light pollution.
    I’ve been contemplating putting together a blog on these topics as I learn. As many of my fellow bloggers in the SQL Server community know, sometimes the best way to learn something is to sit down and write about it. I’m just stumped by coming up with a clever name for the new blog, which I was thinking about inaugurating with my move to Alaska. Except that I don’t know when that will be exactly, so we’ll just have to wait and see which comes first!

    Read the article

  • Master Data Management and Cloud Computing

    - by david.butler(at)oracle.com
    Cloud Computing is all the rage these days. There are many reasons why this is so. But like its predecessor, Service Oriented Architecture, it can fall on hard times if the underlying data is left unmanaged. Master Data Management is the perfect Cloud companion. It can materially increase the chances for successful Cloud initiatives. In this blog, I'll review the nature of the Cloud and show how MDM fits in.
    Here's the National Institute of Standards and Technology Cloud definition:
    • Cloud computing is a model for enabling convenient, on-demand network access to a shared pool of configurable computing resources that can be rapidly provisioned and released with minimal management effort or service provider interaction.
    Cloud architectures have three main layers: applications or Software as a Service (SaaS), Platforms as a Service (PaaS), and Infrastructure as a Service (IaaS). SaaS generally refers to applications that are delivered to end-users over the Internet. Oracle CRM On Demand is an example of a SaaS application. Today there are hundreds of SaaS providers covering a wide variety of applications including Salesforce.com, Workday, and Netsuite. Oracle MDM applications are located in this layer of Oracle's On Demand enterprise Cloud platform. We call it Master Data as a Service (MDaaS). PaaS generally refers to an application deployment platform delivered as a service. PaaS offerings are often built on a grid computing architecture and include database and middleware. Oracle Fusion Middleware is in this category and includes the SOA and Data Integration products used to connect SaaS applications, including MDM. Finally, IaaS generally refers to computing hardware (servers, storage and network) delivered as a service. This typically includes the associated software as well: operating systems, virtualization, clustering, etc.
    Cloud Computing benefits are compelling for a large number of organizations. These include significant cost savings, increased flexibility, and fast deployments. Cost advantages include paying for just what you use. This is especially critical for organizations with variable or seasonal usage. Companies don't have to invest to support peak computing periods. Costs are also more predictable and controllable. Increased agility includes access to the latest technology and experts without making significant up-front investments.
    While Cloud Computing is certainly very alluring with a clear value proposition, it is not without its challenges. An IDC survey of 244 IT executives/CIOs and their line-of-business (LOB) colleagues identified a number of issues:
    • Security - 74% identified security as an issue involving data privacy and resource access control.
    • Integration - 61% found that it is hard to integrate Cloud Apps with in-house applications.
    • Operational Costs - 50% are worried that On Demand will actually cost more given the impact of poor data quality on the rest of the enterprise.
    • Compliance - 49% felt that compliance with required regulatory, legal and general industry requirements (such as PCI, HIPAA and Sarbanes-Oxley) would be a major issue. When control is lost, the ability of a provider to directly manage how and where data is deployed, used and destroyed is negatively impacted.
    There are others, but I singled out these four top issues because Master Data Management, properly incorporated into a Cloud Computing infrastructure, can significantly ameliorate all of these problems. Cloud Computing can literally rain raw data across the enterprise.
    According to fellow blogger Mike Ferguson, "the fracturing of data caused by the adoption of cloud computing raises the importance of MDM in keeping disparate data synchronized." David Linthicum, CTO of Blue Mountain Labs, blogs that "the lack of MDM will become more of an issue as cloud computing rises. We're moving from complex federated on-premise systems, to complex federated on-premise and cloud-delivered systems."
    Left unmanaged, non-standard, inconsistent, ungoverned data of questionable quality can pollute analytical systems, increase operational costs, and reduce the ROI in Cloud and On-Premise applications. As cloud computing becomes more relevant, and more data, applications, services, and processes are moved out to cloud computing platforms, the need for MDM becomes ever more important. Oracle's MDM suite is designed to deal with all four of the Cloud issues listed in the IDC survey above:
    • Security - MDM manages all master data attribute privacy and resource access control issues.
    • Integration - MDM pre-integrates Cloud Apps with each other and with On Premise applications at the data level.
    • Operational Costs - MDM significantly reduces operational costs by increasing data quality, thereby improving the efficiency of enterprise business processes.
    • Compliance - MDM, with its built-in Data Governance capabilities, ensures that the data is governed according to organizational standards. This facilitates rapid and accurate reporting for compliance purposes.
    Oracle MDM creates governed, high-quality master data. A unified, cleansed and standardized data view is produced. The Oracle Customer Hub creates a single view of the customer. The Oracle Product Hub creates high-quality product data designed to support all go-to-market processes. Oracle Supplier Hub dramatically reduces the chances of 'supplier exceptions'. Oracle Site Hub masters locations. And Oracle Hyperion Data Relationship Management masters financial reference data and manages enterprise hierarchies across operational areas from ERP to EPM and CRM to SCM. Oracle Fusion Middleware connects Cloud and On Premise applications to MDM Hubs and brings high-quality master data to your enterprise business processes.
    An independent analyst once said, "Poor data quality is like dirt on the windshield. You may be able to drive for a long time with slowly degrading vision, but at some point, you either have to stop and clear the windshield or risk everything." Cloud Computing has the potential to significantly degrade data quality across the enterprise over time. Deploying a Master Data Management solution prior to or in conjunction with a move to the Cloud can ensure that the data flowing into the enterprise from the Cloud is clean and governed. This will in turn ensure that the expected returns on the investment in Cloud Computing will be realized.
    Oracle MDM has proven its mettle in this area and has the customers to back that up. In fact, I will be hosting a webcast on Tuesday, April 10th at 10 am PT with one of our top Cloud customers, the Church Pension Group. They have moved all mainline applications to a hosted model and use Oracle MDM to ensure the master data is managed and cleansed before it is propagated to other cloud and internal systems. I invite you to join Martin Hossfeld, VP, IT Operations, and Danette Patterson, Enterprise Data Manager, as they review business drivers for MDM and hosted applications, how they did it, the benefits achieved, and lessons learned. You can register for this free webcast here.
Hope to see you there.

    Read the article
