By Adam Lashinsky, editor at large
SAN FRANCISCO (Fortune) -- The Wall Street Journal unleashed a firestorm last week with a page-one article titled, "Google wants its own fast track on the Web." The Journal knew this headline and the words that ran below it would be incendiary.
Google (GOOG, Fortune 500), you see, supports a badly and wonkily named policy called "network neutrality." It basically argues that everyone - and every company - should have equal access to the Internet. This is a peace-love-and-understanding concept, like advocating access to shelter, food and electricity.
In even simpler terms, Google - and most of Obama-supporting, technology-loving Silicon Valley - thinks that the big, bad telephone companies, which paid to build the Internet, should charge every taker the same amount for Internet access, no matter how greedily they consume it.
Broadband, in Google's eyes, is akin to water: It's there for the taking. And if you think otherwise, well, you must be incredibly thick - and/or a phone company executive or lobbyist.
The Journal's suggestion, then, that Google was reversing itself and now supporting a "fast track," which is code language for giving some content providers preferential treatment on the Internet in return for higher payments, was certain to rile up the company that does no evil.
Oh, it was riled.
By 14 minutes after midnight on the day the Journal's story ran, Richard Whitt, Google's "Washington Media and Telecom Counsel" (translation: chief nabob of net neutrality), had posted on Google's Public Policy Blog about why the Journal was "confused."
The article, wrote Whitt, "is based on a misunderstanding of the way in which the open Internet works." (It is standard operating procedure in Washington, of course, to insinuate stupidity as a way of attacking one's critics. When the insinuator insinuates, however, it's usually a sign the bomb thrower has hit its target.)
Should information be free?
As with all wonky subjects, this one is tough to summarize. But here goes.
The Journal article disclosed for the first time that Google has approached Internet service providers about paying for a "fast lane" for its content - a proposal that, if true, strikes at the heart of the net neutrality debate. Internet service providers like AT&T (T, Fortune 500) and Verizon (VZ, Fortune 500) believe they should be able to charge bandwidth hogs like Google tiered pricing: faster access for more money, lower rates for slower access, and so on.
Google denies it's seeking preferential access. Instead, it wants to pay Internet service providers to put its computers in the same room as their computers - an industry practice known as "co-location." Doing so helps the Internet run faster - to the benefit of Google and its users.
Google insists that, by talking to ISPs about a co-location arrangement, it isn't changing course on net neutrality. No one, it says, should have to pay more for Internet access, which the company somehow distinguishes from computer location access.
What's at issue is whether Google's relationship with the phone companies was secret, whether it represents a backing off of Google's net-neutrality philosophy, and where President-elect Barack Obama actually stands on an amorphous issue to which he paid considerable lip service during the campaign.
Google has its critics. Scott Cleland, an independent research analyst and noted anti-net neutrality advocate, suggests Google got burned by its attempt to gain a competitive advantage. ISPs, after all, don't have enough physical space to accommodate every Internet company that wants to co-locate.
"Google probably would have gotten greater benefit of the doubt in the WSJ article," wrote Cleland on his blog, "had it been open and transparent in its attempt to benefit specially from smart network innovation and a free market Internet, and if the secret effort would have benefited Google's dwindling competitors as well."
The bottom line here isn't the fine points of public policy. The main thing is attitude. The Web culture thinks things should be free. Internet access is a commodity. Music videos are for the taking. (See the breakdown in talks between Warner Music (WMG) and YouTube, which is owned by Google.)
The Internet has also trained a new generation of consumers to believe information is free. Consider a particularly thoughtful article on the subject of newspapers by The New Yorker's James Surowiecki. "For a while now, readers have had the best of both worlds: all the benefits of the old, high-profit regime -intensive reporting, experienced editors, and so on - and the low costs of the new one," he wrote. "But that situation can't last. Soon enough, we're going to start getting what we pay for, and we may find out just how little that is."
It's a sentiment Google seems already to have considered - even if it won't say so.
source : http://money.cnn.com/2008/12/23/technology/internet_neutrality_lashinsky.fortune/index.htm?postversion=2008122318
Monday, 29 December 2008
Google Wants Something For Nothing
Hold your tears for Japan Inc.
By Alex Taylor III, senior editor
(Fortune) -- A fair amount of crocodile tears are being shed on this side of the Pacific over the travails of Japanese automakers.
When Toyota (TM) announced that it expected to lose money on its auto operations for the first time since 1950, it became front-page news -- even in papers outside Detroit. A good chunk of the loss is attributable to Toyota's shuttered truck plant in San Antonio, Texas, which has been the source of unconcealed glee in Detroit, so pleased are domestic automakers to see someone making mistakes besides themselves.
Then there was Honda's threat to move more production offshore if the Japanese government couldn't figure out a way to keep the yen above 100 to the dollar. A cheap yen has helped finance much of Japan's success selling cars in the U.S. and Honda's comment was tacit admission of what has long been suspected here: that the Japanese government manipulates swings in its currency to benefit its export-dependent manufacturers.
Finally, there was the surprising -- and still unconfirmed -- report out of Japan that Toyota was considering replacing president Katsuaki Watanabe because of this year's loss. Watanabe would be replaced by vice president Akio Toyoda, a member of the Toyoda family that still controls the automaker despite holding only a tiny fraction of the stock.
On one hand, the move would be surprising, since Watanabe is renowned as a cost-cutter and he has moved quickly and aggressively to bring Toyota's production in line with plummeting demand. On the other, Akio Toyoda has always been seen as heir presumptive to the job of company president and his ascendancy has been a question of when, not if.
Jolting as all these developments are, they do little to undermine the long-term strength of the Japanese auto industry. While the Detroit Three will be preoccupied with questions of politics and survival over the next few months, companies like Toyota and Honda (HMC) are already laying the groundwork for future success.
For evidence, you need only keep an eye on announcements that will be coming out of the Detroit auto show next month.
Already the world leader in gas-electric hybrids, Toyota will be unveiling the third generation of its groundbreaking Prius, along with the first dedicated hybrid model to wear the Lexus badge. Toyota has promised to cut the premium for the hybrid drive in half for this model and it would be foolish to bet against it. The company is well along on its goal of selling one million hybrids a year.
In fact, Toyota will be making a whole raft of announcements aimed at solidifying its position as one of the greenest auto companies on the planet. The centerpiece will be the announcement of an all-electric, battery-powered concept car, probably based on the Smart-sized iQ. Other automakers sometimes view concept cars as so much eye candy -- to titillate analysts and journalists but not to be taken seriously as business propositions. Not Toyota. Every concept has a purpose and it is seldom frivolous.
For its part, Honda is introducing the next generation Insight hybrid and expects to sell 100,000 of them in the U.S. annually. Falling gas prices will make that difficult in the short run, but the Insight should far outperform more ballyhooed efforts from General Motors (GM, Fortune 500) and Ford (F, Fortune 500), which are also introducing new hybrid models.
Toyota and Honda are making the appropriate noises about supporting the Detroit Three in their hour of need. They honestly don't want any of the companies to fail, because a failure might invite a backlash against them, either in Washington or on the showroom floor. Heaven forbid that the United Auto Workers should look toward their assembly plants in the South and try again to organize their non-union workforces.
But don't expect them to pull any punches, either. They are pursuing what they see as their individual destinies -- satisfying customer needs around the world with safe, clean, and affordable transportation.
If any competitors should fall along the way as they pursue their mission, so be it.
source : http://money.cnn.com/2008/12/23/autos/japan.inc.fortune/index.htm?postversion=2008122311
Tech's hope in 2009 - or curse?
By Jon Fortt, senior writer
SAN FRANCISCO (Fortune) -- This Christmas, the titans of the personal-computer industry are finding big lumps of coal in their stockings, and a few are grumbling that it's Intel's fault.
Of course, it's been a bad holiday season for just about everyone - the National Retail Federation expects the weakest holiday sales gains in six years - but it's particularly bad for computers.
Not only has the U.S. economy tumbled into a deep recession, but the rest of the world has fallen in too, ruining the tech industry's overseas growth story. If U.S. consumers are hesitant to drop a few hundred dollars on a new PC, how do you think buyers in developing economies like Brazil are feeling?
There is one relatively bright spot in this gloomy retail season: the "netbook," a device resembling a laptop that's been shot with one of those cartoon miniaturization guns.
The typical netbook weighs 3 pounds, has a 9-inch screen, offers a wireless Internet connection, runs Microsoft (MSFT, Fortune 500) Windows XP and has an Intel chip inside. Oh, and it costs less than $400.
I know what you're thinking. A laptop for less than $400? What's wrong with that? If you're Intel (INTC, Fortune 500), not much.
Rivals fight back
Intel is the company that probably has the most to gain since the most popular netbooks carry Intel's new Atom chip. Atom is smaller, cheaper to produce, and more power-efficient than Intel's mainstream fare, making it an ideal cornerstone for a low-cost laptop.
With this in mind, Intel encouraged the emergence of the netbook segment by selling Atom chips to upstart companies like Acer and ASUS, and allowing the resulting netbooks to be sold in tech-savvy markets in Europe and North America. (Originally, Intel planned to target poorer countries.)
Is Intel worried about cannibalizing sales of higher-end laptops? Not really. Intel executives say that, if anything, Atom-based netbooks seem to be luring buyers who otherwise would have bought laptops with low-cost chips from rival Advanced Micro Devices (AMD, Fortune 500).
But Intel customers like Hewlett-Packard (HPQ, Fortune 500) and Dell (DELL, Fortune 500) aren't so thrilled. Unlike the Taiwanese companies that are embracing netbooks, HP and Dell are frustrated with the low margins at the low end of the business, and are focused on creating clever designs and software that entice consumers to pay more - a strategy that Apple has successfully executed in the past.
"There's no money to be made at $400," one marketing executive said recently. "Consumers might be hungry for a deal, but these are not great machines."
Has the race already begun?
To the big-name brands, bare-bones Atom-based netbooks are a plague. They may be popular, but they are pushing industry heavyweights toward a price war, something they've worked hard to avoid for the last few years.
Silicon Valley executives privately talk about PC price wars as a race to the bottom, where companies vie to put out the cheapest, barely functional product while managing not to lose money. As one CEO described it: "It's like a crap-eating contest. Who wins: the one who eats the most, or the one who eats the least?"
For now, the big names are trying to eat the least. Apple (AAPL, Fortune 500) is staying away from netbooks entirely; CEO Steve Jobs has said he's unwilling to compromise quality to satisfy the bargain bin.
Todd Bradley, the chief of Hewlett-Packard's PC division, has said he's only interested in profitable growth; rather than push bare-bones netbooks, his team is trying to fast-track a premium model developed with input from designer Vivienne Tam.
Dell is selling its Inspiron Mini 9 netbook direct only, a channel that maximizes profits.
Will that work? The latest numbers released by research firm IDC show personal-computer unit sales up but revenues down - a sign of the netbook effect. If that continues, PC makers will be tempted to bring bigger appetites to the crap-eating contest.
source : http://money.cnn.com/2008/12/24/technology/fortt_netbooks.fortune/index.htm?postversion=2008122414
Reaping repo rewards
By David Whitford, editor at large
NEW YORK (Fortune) -- As home prices continue to skid and foreclosure rates soar (up a further 38% since the third quarter of 2007), some investors are on the lookout for outrageous bargains. Think you're ready to jump in?
Consider paying cash
Some transactions require that you close within days. That's not enough time to get a bank loan. A so-called hard-money loan is an option, but you'll pay 15% plus points, and you can't count on refinancing right away.
Perform due diligence
Foreclosures are typically sold as is, where is, but you can inspect the property before you bid. Even after you put down a deposit, you can change your mind and get your money back. Private auctions typically offer a bigger window for deliberation than public auctions on the courthouse steps.
Hire a licensed appraiser
You don't care how much the lender has discounted the note; all that matters is how much the house is really worth. Only an appraiser can tell you that. Could cost you a few hundred dollars to find out. Could save you a few hundred thousand dollars once you know.
Buy short
Instead of foreclosure, Scottsdale real estate advisor Robin Reed often advises what's known as a short sale - a negotiated transaction involving you, the bank, and the homeowner prior to formal foreclosure. You'll have more time to arrange financing - and since the lender has fewer costs to recover, you may get a better price. Check public NOD (notice of default) listings for prospects.
source : http://money.cnn.com/2008/12/18/magazines/fortune/whitford_tips.fortune/index.htm?postversion=2008122410
Oracle's edge
By Michael V. Copeland, senior writer
(Fortune) -- Say what you will about Larry Ellison's style, but the in-your-face founder of Oracle knows how to manage a company through a recession, at least so far.
In an economic climate where other companies are heading for the lifeboats, Ellison is skippering Oracle into a position of strength. And it comes down to selling software that relies on a growing stream of corporate data, rather than a growing number of employees.
During a recent conference call, Ellison and his management team were practically optimistic, projecting that overall revenue for Oracle's fiscal third quarter, which ends in February, will be up 8% to 11%, adjusting for currency exchange.
In its most recent second quarter, revenue came in below guidance, with sales growth of 6% (9% was the Street's estimate), but with operating profit margins at almost 46%, above estimates, and pointing to Oracle's ability to maintain pricing power. (Earnings were down slightly for the quarter, a slip Oracle blamed on the strengthening dollar.)
Oracle (ORCL, Fortune 500) is feeling some pain, like every other company out there, but so far it is not as acute. And digging into the numbers gives you a sense of why Oracle may offer relative safety in these uncertain economic times.
Oracle sells both applications -- human-resources software and customer-relationship software, for example -- and so-called infrastructure software. The latter includes Oracle's core database products, as well as middleware, which acts as a sort of glue between all kinds of software and services.
Applications are generally sold on a per-seat basis, so revenue is tied to staff size at Oracle's customers. Infrastructure software is sold based on capacity - the number of processors (CPUs) in a server running the software, for example.
The scary issue for a lot of tech companies is, of course, headcount. As companies cut staff to weather the recession, they are also cutting the number of seats they need for any number of applications. But companies are less likely to scale back on the efficiencies an automated enterprise can offer them, so that business is not as vulnerable.
Because it is driven by "data, not heads, the (infrastructure) segment should be more stable than other software businesses through the recession," writes Morgan Stanley analyst Adam Holt, who has an "overweight" rating on Oracle, with a 12-month price target of $22.
In that context - data versus heads, or applications versus infrastructure - investors would be wise to look at other software companies, SAP and Microsoft for example, which will be subject to the same forces.
In the second fiscal quarter, Oracle posted database and middleware revenue of almost $3 billion, up 4% year over year. During the same quarter, the applications business was flat to slightly down.
"Oracle's negative year-over-year growth in applications do not bode well for SAP," says JMP Securities analyst Patrick Walravens, who has a "market perform" rating on Oracle. SAP has a very application-heavy product offering. "Our checks so far suggest SAP has already seen some of its larger deal prospects in North America push out."
At some point the economy will recover, and headcount will once again grow. At that point, Oracle will be able to push its applications business harder. In the meantime, unlike some of its competitors, Oracle has the leverage to wring additional revenue from its infrastructure business, and sail -- as it did after the tech bubble burst -- far ahead of the pack.
source : http://money.cnn.com/2008/12/26/technology/investordaily_oracle.fortune/index.htm?postversion=2008122610
Apple stock should shine again
By Michael V. Copeland, senior writer
(Fortune) -- As reliable as a holiday fruitcake that shows up season after season, Apple's stock could once be counted on to ride a year-end price bump. The swirl of anticipation for the shiny new gadgets CEO Steve Jobs would reveal at January's annual Macworld product-fest usually gave it a lift. But not this year.
Rather than rising in this brutal fourth quarter, Apple (AAPL, Fortune 500) has fallen 36% in the past three months, outrunning the 31% decline of the Nasdaq and the 29% drop of the S&P 500 over the same period.
This is clearly not a normal year for anything, and Apple's share price collapse, down 55% since the beginning of the year - coupled with the recent revelation that Jobs will not present at Macworld ever again - has caused some stalwart Apple supporters to lose faith.
But should you? That depends in part on whether you bought Apple at around $200 last year this time, or are contemplating it now at around $85.
Analysts from Goldman Sachs, Oppenheimer, and Morgan Stanley, among others, have downgraded the stock recently. In the face of uncertain consumer demand for iPods, iPhones and, most importantly, Macintosh computers heading into 2009, even the most bullish Apple analysts have lowered revenue estimates for the coming year.
Following the news that Jobs was pulling out of Macworld, Oppenheimer analyst Yair Reiner downgraded the stock to "perform" and declined to give a price target. Like others, Reiner speculated on what the announcement implies about Jobs' health (Jobs has been battling cancer).
"We don't know why Steve Jobs has pulled out of his annual address at Macworld," Reiner wrote in a note. "Maybe he's not feeling well, or maybe he just has nothing new to say. Whatever the reason, the unexpected announcement has underscored the greatest risk to Apple's long-term success - its dependence on Jobs' health and its apparent lack of succession plan... It's past time for Apple to either disclose the state of his health or elaborate a viable plan for eventually transferring power."
Goldman Sachs analyst David Bailey pulled Apple from the bank's "buy" list, citing lower-than-expected shipments of Apple gear in the most recent quarter and a "nearer-term outlook that is less positive." Bailey also cut his 12-month price target from $125 to $115 a share.
"It now looks unlikely that Apple will launch a new product category at Macworld in early January, taking away a potential catalyst for the shares and causing Apple to try and generate demand in a tough environment without the benefit of a new offering." The stock was downgraded from a "buy" to "neutral."
But if you didn't buy Apple back in May, at around $180 a share, when the Cupertino, Calif.-based company was added by Bailey to the Goldman Sachs "Conviction Buy" list (not much conviction there, apparently), is it a value at less than half that price?
Leaving aside the question of Jobs' health and his possible successor, which is a big uncertainty, Apple's underlying fundamentals are solid. Consider the $24.5 billion in cash Apple is sitting on, with no debt. That's north of $27 in net cash per share against a roughly $95 stock price. Apple's balance sheet is a monumental advantage over competitors like Dell (DELL, Fortune 500) and HP (HPQ, Fortune 500), giving Apple both solidity and the flexibility to move quickly if necessary in this economy.
And while slightly below most estimates in the teeth of the global recession, sales of the new line of MacBook notebook computers, higher priced iPods and the 3G iPhone are still relatively strong, especially compared to other PC and smartphone vendors.
Needham analyst Charlie Wolf shrugs off Jobs' Macworld exit. "Apple wants to get away from the tyranny of Macworld, where it is forced to introduce new products on the schedule of IDG (the show's producer) rather than its own," Wolf says, adding that his sources indicate "Jobs is cancer free."
Much of the weakness Apple may see in iPod and Mac sales next year should be offset by iPhone revenue, Wolf predicts. Wolf has a "strong buy" rating on Apple with a lofty 12-month target price of $240, assuming 2009 revenue of $36.6 billion and a forward P/E of around 18.
That is about as high a price target as you can find among analysts - most are clustered in the $115 to $125 range. Barclays Capital, which also has a buy rating on the stock, has a 12-month price target of $113, based on a 20x multiple of fiscal 2010 EPS of $5.70.
"We continue to believe that Apple deserves a much higher multiple relative to both the group and the market; given it is one of the best long-term growth stories in the space," wrote Barclay's analyst Ben Reitzes in a note following the Macworld announcement.
Reitzes, too, loves Apple's cash position, estimating $10 per share in free cash flow in fiscal 2010. "While we see a few tough quarters (ahead), the company's business model should still allow it to garner above average returns over the long-term."
Long-term seems to be the key with Apple. Does long-term mean a future at Apple without Steve Jobs? At some point yes, but does that mean Apple will stop offering the superior user experience upon which it has built its reputation and market? Not likely.
You can beat yourself up over share price moves between now and midyear, when most folks expect new products to once again come out of the Apple factory. But if you hold on long term Apple ought to bring some rewards, especially if you are getting in now.
source : http://money.cnn.com/2008/12/24/news/companies/apple.fortune/index.htm?postversion=2008122410
Highways Agency combines with Google for route 2.0
The Highways Agency is working with Google to make traffic data available for use within the Google Maps traffic feature.
The traffic feature overlays a colour-coded layer showing average speeds on England's motorways and major A-road network onto the existing Google Maps facility, with different colours indicating the current speed of traffic.
The Highways Agency provides its traffic information to Google in a Datex II format, which is a European standard developed specifically for road data information exchange. The Datex organisation provides tools to convert the data model into an XML schema.
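As a rough sketch of what consuming such a feed involves - the element names below are illustrative placeholders, not the actual Datex II schema - an XML payload can be walked with Python's standard library:

    import xml.etree.ElementTree as ET

    # Illustrative snippet standing in for a Datex II traffic payload;
    # real element names are defined by the Datex II XML schema.
    xml_payload = """
    <trafficData>
      <measurement road="M1" averageSpeedKmh="92"/>
      <measurement road="M25" averageSpeedKmh="41"/>
    </trafficData>
    """

    root = ET.fromstring(xml_payload)
    for m in root.iter("measurement"):
        # The colour-coded map layer could be derived from these speeds.
        print(m.get("road"), m.get("averageSpeedKmh"), "km/h")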
The system provides real-time traffic information and predictions based on past conditions, helping motorists to plan their journeys ahead to avoid congestion.
Denise Plumpton, director of information at the Highways Agency, says providing the data for the mash-up is a key part of the agency's information strategy designed to get traffic information out to motorists where and when they need it.
"We work regularly with third-party organisations to get our information to as wide an audience as possible," she says.
By publishing data to a wide audience, in a format in which they can use it, the Highways Agency is fulfilling its remit to help people avoid hold-ups, which in turn helps reduce congestion.
As well as reaching the Google mapping service, traffic data, provided by the agency's National Traffic Control Centre in Birmingham, is used to populate the Agency's own Traffic England website.
Highways Agency spokesman Anthony Aston says the agency would provide the same raw data to businesses wishing to integrate it with their applications as is available to Google, subject to discussion.
However, large businesses requiring detailed traffic data to integrate with internal applications can use the text-based Atlas Professional system direct from the agency.
Atlas Professional allows businesses to view only data from the areas that concern them. It also enables users to create a customised RSS traffic information feed, providing headline alerts for specific roads defined in "My Areas".
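For illustration only - the feed URL below is an assumption, not the agency's actual address - consuming such a customised feed might look like this with the third-party feedparser library:

    import feedparser  # pip install feedparser

    # Hypothetical URL for a user-defined "My Areas" feed.
    FEED_URL = "http://www.trafficengland.com/rss/my-areas.xml"

    feed = feedparser.parse(FEED_URL)
    for entry in feed.entries:
        # Each entry carries one headline alert for a monitored road.
        print(entry.title, "-", entry.summary)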
Meanwhile, City Timegrid is a high-level overview of the current state of the road network shown as current travel times between major destinations. In the coming months, new destinations will be added to the 14 existing choices in the City Timegrid.
The agency is gathering feedback from users before deciding which cities, event venues or transport interchanges will be included on this feature of Atlas Pro.
The agency is considering a number of future projects to build applications from a variety of sources of traffic data, Aston says. There is potential to bring in traffic data from organisations that run large fleets of vehicles, such as National Express. Satellite navigation systems could also provide a rich source of data.
"Obviously if TomTom has got 500 sat-navs on a section of the M1, there is a lot of useful information on traffic flow there," says Aston.
Data could be combined with the Highways Agency's own and republished for the benefit of businesses and all road users, Aston says.
Photos: 2008 review of the year in IT
On 11 February, Yahoo officially rebuffed Microsoft's £22 billion ($44.6 billion) takeover bid. Microsoft persevered, even raising the offer by $5bn, but it reluctantly abandoned negotiations on 3 May. The initiative would have won Microsoft a successful online services portfolio which included search and online advertising. Some argued that the move was an admission Microsoft couldn’t compete against Google in those markets on its own. Computer Weekly’s Cliff Saran wrote that Microsoft users had survived a close shave, arguing that the Yahoo deal would have stretched Microsoft too far, spelling the end for Windows.
Source : http://www.computerweekly.com/galleries/233817-1/2008-IT-year-in-pictures-Yahoo-says-no-to-Microsoft.htm
Strategic Process Control
Although a control system must be tailored to its specific situation, control systems follow the same basic process, usually comprising these six steps:
• Determine what to control
• Set standards
• Measure performance
• Compare performance against the standards
• Determine the reasons for deviations
• Take corrective action
Feedback from evaluating a strategy's effectiveness is likely to affect other phases of the strategic management process. A well-designed control system includes feedback of control information to the individuals or groups whose activities it governs. A simple feedback system measures the outputs of a process and feeds corrective action back into the inputs to obtain the desired outputs. The consequence of relying on feedback control is that unsatisfactory performance continues until the error is found. One technique for reducing the problems associated with feedback control is feedforward control. A feedforward system monitors the inputs to a process to determine whether the inputs conform to plan; if not, the inputs or the process may be changed to obtain the desired results.
The second step in the control process is setting standards. A standard is a target against which performance will be compared; standards are the criteria that allow managers to evaluate past, current and future actions. Standards are measured in various ways, including physical, quantitative and qualitative forms. Five aspects of performance can be managed and controlled: quantity, quality, time, cost and behavior. Each aspect of control may require additional grouping. General Electric uses eight types of standards, among them profitability standards, market-position standards, productivity standards, product-leadership standards, human-resource development standards, employee-attitude standards and public-accountability standards.
The third step is measuring performance. Actual performance must be compared with the standard. In some settings this step requires only visual observation; in other situations careful measurement is needed. Many control measures are based on some form of historical standard. For example, standards may be based on data derived from PIMS (profit impact of market strategy), which displays generally available information such as product/service quality ratings, innovation rates and relative market-share standings. PIMS was developed by Sidney Schoeffler of Harvard University in the 1960s. Another traditional performance measure is Return on Investment (ROI), often not distinguished from Return on Assets (ROA): the ratio of net income before tax to total assets. Other traditional measures are Return on Equity (ROE) and Earnings per Share (EPS). A popular method of measuring corporate and divisional performance is Economic Value Added (EVA), the difference between the value of a business before and after a strategy is implemented; EVA has been steadily replacing ROI. Derived from EVA is Market Value Added (MVA), the difference between the market value of the company and the capital contributed by shareholders and creditors. Companies such as General Electric, Microsoft, Intel and Coca-Cola have high MVA. Another standard of strategic management practice is competitive benchmarking, the process of measuring a company's performance against the best performance in its industry. The benchmarking pioneer in the United States was Xerox. Most of the world's most admired companies benchmark the products and services they produce.
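To make these measures concrete, here is a minimal sketch in Python, with entirely made-up figures, applying the definitions above (ROI/ROA as net income before tax over total assets, MVA as market value minus contributed capital):

    # Hypothetical figures, in millions of dollars (and millions of shares).
    pretax_income = 120.0
    net_income = 90.0
    total_assets = 1_000.0
    equity = 400.0
    shares = 50.0
    market_value = 900.0
    contributed_capital = 400.0

    roi = pretax_income / total_assets        # ROI/ROA: 0.12, i.e. 12%
    roe = net_income / equity                 # ROE: 0.225, i.e. 22.5%
    eps = net_income / shares                 # EPS: $1.80 per share
    mva = market_value - contributed_capital  # MVA: $500M created for investors

    print(f"ROI {roi:.1%}, ROE {roe:.1%}, EPS ${eps:.2f}, MVA ${mva:.0f}M")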
The fourth step is comparing actual performance with the standard. If the previous steps have gone well, comparing performance with standards is straightforward. Sometimes, though, the needed comparisons are difficult to make - with behavioral standards, for example. Some deviations from the standard can be justified, because environmental conditions have changed, among other reasons.
The fifth step of the process involves digging into the question "why has performance deviated from the standard?" Deviations can move the organization away from its chosen targets, so the organization needs to trace deviations to both internal and external changes within its span of control. The following general guidelines can help:
• Are the standards consistent with the stated strategy and goals?
• Are the targets and the strategy still appropriate in the current environmental situation?
• Are the organization's structure, leadership, staff skills and systems adequate to implement the strategy successfully?
• Are the activities being conducted in accordance with the standards of achievement?
Pinpointing causes, whether internal or external, has different implications for the various corrective actions.
The next step in the process is determining the need for corrective action. Managers can choose among three courses:
• Do nothing
• Revise the standard
• Correct the actual performance
Doing nothing is appropriate when performance convincingly meets the standard. When the standard is not appropriate, the manager must carefully assess the reasons why and take corrective action. Standards should also be checked periodically to ensure that they, and the associated performance measures, remain relevant for the future. Finally, managers must decide on actions to correct performance when deviations occur. Corrective action depends on discovering the source of the deviation and on the ability to take the necessary steps; often the cause of a deviation must be made clear before corrective action is taken, since deviations can stem from anything from an unrealistic target to the wrong strategy.
A New Perspective on Strategic Control
Strategic control, according to Schendel and Hofer, focuses on two questions: (1) is the strategy being implemented as planned, and (2) are the results produced by the strategy the ones expected? This definition refers to the traditional review and performance feedback that form the final step of the strategic management process. Normative models of the strategic management process describe the major steps as strategy formulation, strategy implementation and strategy evaluation (control).
Strategic control of this traditional kind relies mainly on reviewing performance and feeding it back to determine whether plans, strategies and targets have been achieved, with the resulting information used to solve problems or take corrective action.
Strategic Control
Strategic control focuses on how a strategy is implemented, detecting problems or areas of potential problems and making the necessary adjustments. Newman and Logan use the terminology "rudder control systems" to highlight some important characteristics of strategic control. Typically, a significant span of time passes between the start of a strategy's implementation and the achievement of its intended results. During that time, projects are carried out, investments are made and actions are undertaken to implement the new strategy, while the company's internal situation and external environment keep growing and developing. Strategic control is needed to steer the company through these events, and it should provide a basis for midcourse correction drawn from interim performance and new information.
Henry Mintzberg notes that however well an organization plans its strategy, a different strategy may emerge. Starting from the planned or intended strategy, several outcomes are possible:
• An intended strategy that comes to be realized is called a deliberate strategy
• An intended strategy that does not come to be realized is called an unrealized strategy
• A realized strategy that was never intended is called an emergent strategy
Organizational Control
Organizational control consists of three types: strategic control, management control and operational control. Strategic control is the process of evaluating strategy, both as it is formulated and after it is implemented. Management control focuses on achieving the targets of the various substrategies consistent with the overall strategy and on achieving the main targets of the medium-term plan. Operational control is based on the performance of individuals and groups, compared with the individual and group roles set out in the organization's plans. The three types of control are not fully separate and not sharply distinct; in practice, one may be hard to tell from another.
Control systems identify operational performance standards related to the allocation and use of financial resources, physical factors and facilities - the factors that determine the success of the main operations. Operational control systems demand systematic evaluation of performance against predetermined standards or targets. What matters here is identifying and evaluating performance deviations, with special attention directed at determining the underlying reasons for, and strategic implications of, deviations before management reacts. Some companies use trigger points and contingency plans in this process.
An alternative approach connecting strategic and operational control, developed by Robert Kaplan and David Norton, is the Balanced Scorecard. It adapts ideas from Total Quality Management (TQM) - customer-defined quality, continuous improvement, employee empowerment and measurement-based feedback - and extends the methodology with financial data and results. The Balanced Scorecard is a management system that enables a company to clarify its strategy, translate it into action and obtain quantitative feedback, building core competencies, satisfying customers and generating financial returns for shareholders. Many of the world's most admired companies use the Balanced Scorecard, among them General Electric, Toyota Motor, Procter & Gamble, Johnson & Johnson, Apple, Berkshire Hathaway, FedEx, Microsoft, BMW, IBM, Singapore Airlines and Nokia.
Marketing Strategy
Marketing strategy includes market segmentation and market targeting, product strategy, pricing strategy, placement strategy and promotion strategy. For consumer-product marketing, the main segmentation variables are geographic segmentation, demographic segmentation, psychographic segmentation, behavioral segmentation and benefit segmentation.
Consumer Market Segmentation
Geographic segmentation divides the market into distinct geographical units - regions, countries, states, provinces, cities or islands, for example. Nissan Motor applied geographic segmentation in marketing the MICRA: for the French market, the ads used the French language under the title "SPURE" (shown in figure 7.1). The billboard ads, made by TBWA Paris, won the Bronze Euro Effie in 2004. For the UK market, the ads used English under the title "MODRO" (shown in figure 7.2); those billboard ads, made by TBWA London, also received the Bronze Euro Effie in 2004.
Human Resources Strategy
Human resource strategy includes human resource planning, recruitment and selection, training and development, performance evaluation, compensation and retention of human resources.
Human Resources Planning
Human resource planning refers to how the company identifies the human resource implications of organizational change and of key business issues, and combines them with the human resource needs that result from those changes and issues.
The phases of human resource planning include identifying key business issues, determining their human resource implications, developing human resource targets and goals, and evaluating human resources.
General Electric CEO Jeffrey Immelt sets aside roughly a month each year, moving from one business to another, to help with the strategic human resource planning process.
Financial Strategy
by : M. Suyanto
Financial strategy aims at using financial resources to support business strategy, both long term and short term. It covers capital acquisition, capital allocation, dividend allocation and working capital management.
Capital acquisition usually involves weighing a reasonable cost of capital, the proportion of short-term and long-term debt, the desired balance between internal and external funding, risk and ownership restrictions, and the level and form of leasing to be used. Microsoft raised capital by going public: its shares sold at 21 dollars and closed at 28 dollars in 1986, and the stock price peaked at 119 dollars in 1999.
Capital allocation priorities include consideration of how capital is allocated to projects, the basis for final project selection and how much capital may be allocated without the approval of higher-level operational managers. Microsoft paid no dividend from 18 September 1987 until 16 January 2003; capital was used instead for projects such as Microsoft Office, launched in 1989, and Windows 3.0, released in 1990. Windows NT 3.1 followed in 1993, and in 1995 came Windows 95, equipped with Internet Explorer and MSN (the Microsoft Network). Windows CE 1.0 arrived in 1996, Internet Explorer 4.0 in 1997, Windows 98 in 1998 and Windows XP in 2001.
Dividend allocation and working capital management include consideration of how much profit should be distributed as dividends, dividend stability, forms of dividend other than cash, cash flow needs, the minimum cash balance, maximum credit policy, billing and credit policies, and payment provisions and procedures. Berkshire Hathaway puts more emphasis on building the company's net worth than on distributing large dividends, unlike AOL Time Warner, Exxon Mobil and Toyota.
Operating Strategy
by : m.suyanto
Operating strategy is the strategy for transforming inputs (raw materials, supporting materials, people and machines) into valuable outputs. It must be coordinated with the marketing, human resource and financial strategies, and it concerns facilities and equipment, resources, and the planning and control of operations.
Facilities and Equipment
The facilities and equipment component concerns factory location, factory size, and the equipment and facilities that support the business strategy and the other functional strategies. Toyota Motor Company's factory in Texas, begun in 2003 and completed in 2006, was designed to advance the Toyota Production System (TPS) and conceived as a pilot for new levels of quality, efficiency and advanced technology in the industry. The facility, explained Don Jackson, vice president of quality and production at TMMTX, would introduce a number of patented production methods and a new level of information handling; more important, it would serve as a pilot for the Toyota Production System, showing how some 21 separate parts and components suppliers could be combined and integrated in one place. The Toyota Tundra plant in Texas, TMMTX, would be the first car factory to integrate suppliers' production facilities at the same site, in some cases under the same roof as the main car manufacturer.
Backlink
Backlinks (or back-links (UK)) are incoming links to a website or web page. In the search engine optimization (SEO) world, the number of backlinks is one indication of the popularity or importance of that website or page (though other measures, such as PageRank, are likely to be more important). Outside of SEO, the backlinks of a webpage may be of significant personal, cultural or semantic interest: they indicate who is paying attention to that page.
In basic link terminology, a backlink is any link received by a web node (web page, directory, website, or top-level domain) from another web node (Björneborn and Ingwersen, 2004). Backlinks are also known as incoming links, inbound links, inlinks, and inward links.
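As a minimal illustration (not part of the Wikipedia article), a backlink index can be built by inverting a map of outbound links; the page names below are made up:

    # Invert an outbound-link map into an inbound-link (backlink) index.
    outlinks = {
        "a.html": ["b.html", "c.html"],
        "b.html": ["c.html"],
        "c.html": ["a.html"],
    }

    backlinks = {}
    for page, targets in outlinks.items():
        for target in targets:
            backlinks.setdefault(target, []).append(page)

    print(backlinks["c.html"])  # ['a.html', 'b.html'] -> c.html has two backlinks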
source : http://en.wikipedia.org/wiki/Backlink
The PageRank Algorithm
PageRank is a probability distribution used to represent the likelihood that a person randomly clicking on links will arrive at any particular page. PageRank can be calculated for collections of documents of any size. It is assumed in several research papers that the distribution is evenly divided between all documents in the collection at the beginning of the computational process. The PageRank computations require several passes, called "iterations", through the collection to adjust approximate PageRank values to more closely reflect the theoretical true value.
A probability is expressed as a numeric value between 0 and 1. A 0.5 probability is commonly expressed as a "50% chance" of something happening. Hence, a PageRank of 0.5 means there is a 50% chance that a person clicking on a random link will be directed to the document with the 0.5 PageRank.
Simplified algorithm
(Figure: How PageRank works.)
Assume a small universe of four web pages: A, B, C and D. The initial approximation of PageRank would be evenly divided between these four documents. Hence, each document would begin with an estimated PageRank of 0.25.
In the original form of PageRank, initial values were simply 1, which meant that the sum over all pages was the total number of pages on the web. Later versions of PageRank (see the formulas below) assume a probability distribution between 0 and 1. Here we simply use a probability distribution, hence the initial value of 0.25.
If pages B, C, and D each link only to A, they each confer 0.25 PageRank to A. All PageRank in this simplistic system would thus gather to A, because all links would be pointing to A.
PR(A) = PR(B) + PR(C) + PR(D)
This is 0.75.
Suppose instead that page B also has a link to page C, and page D has links to all three pages. The value of the link-votes is divided among all the outbound links on a page. Thus, page B gives a vote worth 0.125 to page A and a vote worth 0.125 to page C. Only one third of D's PageRank is counted for A's PageRank (approximately 0.083).
PR(A) = \frac{PR(B)}{2} + \frac{PR(C)}{1} + \frac{PR(D)}{3}
In other words, the PageRank conferred by an outbound link equals the document's own PageRank score divided by the number of its outbound links L (it is assumed that links to specific URLs only count once per document).
PR(A) = \frac{PR(B)}{L(B)} + \frac{PR(C)}{L(C)} + \frac{PR(D)}{L(D)}
In the general case, the PageRank value for any page u can be expressed as:
PR(u) = \sum_{v \in B_u} \frac{PR(v)}{L(v)},
i.e. the PageRank value for a page u is dependent on the PageRank values for each page v out of the set Bu (this set contains all pages linking to page u), divided by the number L(v) of links from page v.
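A minimal sketch of this simplified, damping-free computation for the four-page example above (B links to A and C, C links to A, D links to A, B and C):

    # One application of the simplified PageRank formula: a page's score is
    # the sum of each inbound page's score divided by its outbound-link count.
    links = {                 # page -> pages it links to
        "A": [],
        "B": ["A", "C"],
        "C": ["A"],
        "D": ["A", "B", "C"],
    }
    pr = {page: 0.25 for page in links}   # initial uniform distribution

    def simplified_pr(page):
        return sum(pr[v] / len(links[v]) for v in links if page in links[v])

    print(round(simplified_pr("A"), 4))   # 0.25/2 + 0.25/1 + 0.25/3 ≈ 0.4583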
Damping factor
The PageRank theory holds that even an imaginary surfer who is randomly clicking on links will eventually stop clicking. The probability, at any step, that the person will continue is a damping factor d. Various studies have tested different damping factors, but it is generally assumed that the damping factor will be set around 0.85.[4]
The damping factor is subtracted from 1 (and in some variations of the algorithm, the result is divided by the number of documents in the collection) and this term is then added to the product of the damping factor and the sum of the incoming PageRank scores.
That is,
PR(A) = 1 - d + d \left( \frac{PR(B)}{L(B)} + \frac{PR(C)}{L(C)} + \frac{PR(D)}{L(D)} + \cdots \right)
or (N = the number of documents in collection)
PR(A) = \frac{1 - d}{N} + d \left( \frac{PR(B)}{L(B)} + \frac{PR(C)}{L(C)} + \frac{PR(D)}{L(D)} + \cdots \right)
So any page's PageRank is derived in large part from the PageRanks of other pages. The damping factor adjusts the derived value downward. The second formula above supports the original statement in Page and Brin's paper that "the sum of all PageRanks is one".[2] Unfortunately, however, Page and Brin gave the first formula, which has led to some confusion.
Google recalculates PageRank scores each time it crawls the Web and rebuilds its index. As Google increases the number of documents in its collection, the initial approximation of PageRank decreases for all documents.
The formula uses a model of a random surfer who gets bored after several clicks and switches to a random page. The PageRank value of a page reflects the chance that the random surfer will land on that page by clicking on a link. It can be understood as a Markov chain in which the states are pages, and the transitions are all equally probable and are the links between pages.
If a page has no links to other pages, it becomes a sink and therefore terminates the random surfing process. However, the solution is quite simple. If the random surfer arrives at a sink page, it picks another URL at random and continues surfing again.
When calculating PageRank, pages with no outbound links are assumed to link out to all other pages in the collection. Their PageRank scores are therefore divided evenly among all other pages. In other words, to be fair with pages that are not sinks, these random transitions are added to all nodes in the Web, with a residual probability of usually d = 0.85, estimated from the frequency that an average surfer uses his or her browser's bookmark feature.
So, the equation is as follows:
PR(p_i) = \frac{1 - d}{N} + d \sum_{p_j \in M(p_i)} \frac{PR(p_j)}{L(p_j)}
where p1,p2,...,pN are the pages under consideration, M(pi) is the set of pages that link to pi, L(pj) is the number of outbound links on page pj, and N is the total number of pages.
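As a minimal sketch (with a made-up graph and iteration count), this equation can be computed iteratively, including the sink handling described above; here a sink's rank is spread evenly across every page, a common simplification of the "all other pages" rule:

    # Iterative PageRank with damping factor d and sink handling.
    def pagerank(links, d=0.85, iterations=50):
        pages = list(links)
        n = len(pages)
        pr = {p: 1.0 / n for p in pages}       # uniform initial distribution
        for _ in range(iterations):
            # Rank held by sink pages is spread evenly across all pages.
            sink_mass = sum(pr[p] for p in pages if not links[p])
            new = {}
            for p in pages:
                inbound = sum(pr[q] / len(links[q]) for q in pages if p in links[q])
                new[p] = (1 - d) / n + d * (inbound + sink_mass / n)
            pr = new
        return pr

    links = {"A": [], "B": ["A", "C"], "C": ["A"], "D": ["A", "B", "C"]}
    print(pagerank(links))  # scores sum to ~1.0, with A ranked highest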
The PageRank values are the entries of the dominant eigenvector of the modified adjacency matrix. This makes PageRank a particularly elegant metric: the eigenvector is
\mathbf{R} = \begin{bmatrix} PR(p_1) \\ PR(p_2) \\ \vdots \\ PR(p_N) \end{bmatrix}
where R is the solution of the equation
\mathbf{R} = \begin{bmatrix} {(1-d)/ N} \\ {(1-d) / N} \\ \vdots \\ {(1-d) / N} \end{bmatrix} + d \begin{bmatrix} \ell(p_1,p_1) & \ell(p_1,p_2) & \cdots & \ell(p_1,p_N) \\ \ell(p_2,p_1) & \ddots & & \vdots \\ \vdots & & \ell(p_i,p_j) & \\ \ell(p_N,p_1) & \cdots & & \ell(p_N,p_N) \end{bmatrix} \mathbf{R}
where the adjacency function \ell(p_i,p_j) is 0 if page pj does not link to pi, and normalised such that, for each j
\sum_{i = 1}^N \ell(p_i,p_j) = 1,
i.e. the elements of each column sum up to 1.
This is a variant of the eigenvector centrality measure used commonly in network analysis.
Because of the large eigengap of the modified adjacency matrix above,[5] the values of the PageRank eigenvector are fast to approximate (only a few iterations are needed).
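As an illustrative sketch (not from the article), the same eigenvector can be approximated by power iteration on the matrix form above, using NumPy. The four-page graph is the running example; column j holds \ell(p_i, p_j) = 1/L(p_j), with the sink column for A made uniform as a common simplification:

    import numpy as np

    d, n = 0.85, 4
    # Column-stochastic link matrix over pages A, B, C, D (each column sums to 1).
    M = np.array([
        [0.25, 0.5, 1.0, 1/3],   # links into A
        [0.25, 0.0, 0.0, 1/3],   # links into B
        [0.25, 0.5, 0.0, 1/3],   # links into C
        [0.25, 0.0, 0.0, 0.0],   # links into D (none)
    ])

    R = np.full(n, 1.0 / n)          # initial uniform vector
    for _ in range(20):              # a few iterations suffice
        R = (1 - d) / n + d * M @ R
    print(R, R.sum())                # approximate dominant eigenvector; sums to ~1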
As a result of Markov theory, it can be shown that the PageRank of a page is the probability of being at that page after many clicks. This happens to equal t^{-1}, where t is the expectation of the number of clicks (or random jumps) required to get from the page back to itself.
The main disadvantage is that it favors older pages, because a new page, even a very good one, will not have many links unless it is part of an existing site (a site being a densely connected set of pages, such as Wikipedia). The Google Directory (itself a derivative of the Open Directory Project) allows users to see results sorted by PageRank within categories. The Google Directory is the only service offered by Google where PageRank directly determines display order. In Google's other search services (such as its primary Web search) PageRank is used to weigh the relevance scores of pages shown in search results.
Several strategies have been proposed to accelerate the computation of PageRank.[6]
Various strategies to manipulate PageRank have been employed in concerted efforts to improve search results rankings and monetize advertising links. These strategies have severely impacted the reliability of the PageRank concept, which seeks to determine which documents are actually highly valued by the Web community.
Google is known to actively penalize link farms and other schemes designed to artificially inflate PageRank. In December 2007, Google started actively penalizing sites selling paid text links. How Google identifies link farms and other PageRank manipulation tools is among Google's trade secrets.
PageRank
PageRank is a link analysis algorithm used by the Google Internet search engine that assigns a numerical weighting to each element of a hyperlinked set of documents, such as the World Wide Web, with the purpose of "measuring" its relative importance within the set. The algorithm may be applied to any collection of entities with reciprocal quotations and references. The numerical weight that it assigns to any given element E is also called the PageRank of E and denoted by PR(E).
The name "PageRank" is a trademark of Google, and the PageRank process has been patented (U.S. Patent 6,285,999 ). However, the patent is assigned to Stanford University and not to Google. Google has exclusive license rights on the patent from Stanford University. The university received 1.8 million shares in exchange for the patent, which were sold in 2005 for $336 million US Dollars.
source : http://en.wikipedia.org/wiki/PageRank
Saturday, 27 December 2008
Gateway FX6800-01e
By Joel Santo Domingo
The Gateway FX6800-01e ($1,249.99 list) is clearly going after the gamer on a budget. It's a bit more expensive than sub-$800 gaming boxes, but the FX6800-01e can play today's games a whole lot better. There are a few tremendous innovations in the FX6800-01e, and its performance is stellar for its price. A couple of niggling details keep it from besting the $2,000 gaming PCs, but if you're someone who upgrades by buying a whole new system every few years, the FX6800-01e will rock your world.
The FX6800-01e is equipped with one of the new Intel Core i7-920 processors, a quad-core processor with Hyper-Threading technology. In practice, this means that the i7-920 is capable of processing up to eight streams simultaneously, a plus when you're doing multimedia work or playing multithreaded games when they are available (games are only now just starting to support dual-thread; future games will be much more multithreaded). The Core i7 family is Intel's latest set of CPUs, with the second iteration of 45nm technology (code-named Nehalem). Along with the Core i7-920, the FX6800-01e is equipped with a single 512MB ATI Radeon HD 4850 graphics card. The Radeon 4850 is ATI's most recent mainstream gaming card, so both the CPU and graphics card can be considered "gaming/performance oriented."
The system has the same chassis as the Gateway XL series, but in a different color (silver and blue as opposed to orange with bronze accents). The FX6800-01e's chassis includes a pair of externally accessible hard-drive sleds, so you can easily add up to two SATA hard drives without opening the side door. This innovative feature (unique among the cheaper gaming PCs) is similar to the hard-drive compartment in the $2,199 Acer Aspire G7700 Predator. Both have access to the outside, though the Predator holds the C: drive in one of its three sleds, while the FX6800-01e's C: drive is inside the chassis. There's also space inside for one additional hard drive, though you'll have to open the case to get to that one. You can also fit another optical drive, a PCIe x4 card, another PCIe x16 graphics card, and three more RAM sticks inside the case. As with the Predator, you can set up RAID 0, 1, 5, and 1+0 with multiple hard drives. Just make sure to use identical hard drives if you do.
The sleds themselves feel a bit flimsier than the ones in the Predator (or the internal sleds on the HP Blackbird 002 Exhilaration Edition—the best ones out there so far), but they'll do the job. I wish there were a better locking mechanism for the drives as well: There's a slide-down door on the front panel, but I like the security of locking sleds in case you need to move the system. Since I'm nit-picking, the hide-away pop-up panels for the media card reader, USB, and FireWire ports in the front and on top of the case also feel a bit cheap. The quality of plastics and doors seems to be where Gateway has cut a few corners. Again, functionality is excellent; there are just a few cost-saving measures that are readily apparent in a hands-on evaluation.
One last nit relates to the OS: Why install 64-bit Vista when the system comes with only 3GB of memory? The benefits of 64-bit don't come into play until you have more than 4GB of RAM. I suspect Gateway put 64-bit Vista on the FX6800-01e on the assumption that you'll add another 3GB to the system at some future point.
Like other Gateway computers, the FX6800-01e has some crapware on it. There's the usual 60-day trial of Microsoft Office and the 60-day trial of Norton 360. There's other stuff, like ads for eBay, Napster, and ISPs, too. I just wish there were a way to reinstall Vista without also installing all the superfluous programs. That's the best solution for a gaming box, since new graphics drivers and extra files associated with games can often make the system unstable. The best way to play a new game on an unstable gaming system is to wipe the system to bare-bones Windows, without all the performance-robbing extras.
Along with very good functionality, the system offers performance that's excellent for the price. The FX6800-01e can hold the "Can Play Crysis" banner high. Thanks to the Core i7 processor and Radeon graphics, the FX6800-01e gets a smoothly playable 58 fps on Crysis at 1,280-by-1,024, and a very smooth 760 fps on World in Conflict at the same resolution. This performance is similar to or better than that of the iBuypower Paladin 998 and the Acer Aspire G7700 Predator, both of which cost over $700 more. Of these three, the Paladin is the only one that can play WiC at 1,920-by-1,200, and none of the three can play Crysis smoothly at that high a resolution.
The FX6800-01e is also very speedy at multimedia tasks: 37 seconds for Windows Media Encoder and 26 seconds for Photoshop are impressive even for a quad-core system. All this makes the FX6800-01e a performance bargain.
Basically, the FX6800-01e provides enough gaming power for the serious gamer, with a little bit of headroom to expand and improve 3D performance. The more expensive Paladin 998 and Predator also have enough gaming chops, and they both have a lot more room for expansion. Buy the Paladin if you have the extra $700 to spare and like to tinker and improve your gaming by buying and installing more components. Buy the Gateway FX6800-01e if you're the type who improves performance by buying a new gaming system a few years down the road, and recycles by giving the old system to your younger sibling.
64-Bit Computing Has Finally Arrived
by John Brandon
In technology, some ideas take time to germinate, none more so than 64-bit computing, where the operating system and software (including most drivers) run on a 64-bit CPU from Intel or AMD. Linux has been 64-bit for eight years, and Apple's operating system for five. But compatibility problems have dogged the 64-bit versions of Windows since their introduction with Windows XP. There are several key advantages, such as improved performance and support for many gigabytes of RAM. The real question is, why 64-bit—and why now? And why should you care?
Let's be honest: The promise of 64-bit computing has been around for a while—some would say it's a broken promise. Yet the planets have finally aligned: Microsoft offers a 64-bit version of both Windows Vista Ultimate and Windows XP Pro, and 64-bit versions of Linux are freely available. According to Gartner, one out of every four PCs sold today comes with a 64-bit OS installed. As for hardware, both Intel and AMD have offered 64-bit processors for years. And the additional RAM supported by the wider data bus is now amazingly affordable, thanks to a streamlined manufacturing process and mainstream levels of demand.
Most important, companies such as Adobe, Apple, and Autodesk (and that's just those that start with the letter A) now offer their flagship software products in 64-bit versions. Adobe, for the first time, offers its Creative Suite 4 in a 64-bit version—currently for PC only, with a Mac version in the works.
The main benefit has to do with memory addressing. A quick lesson in processor technology: Long ago, the brilliant minds in computer science (engineers working at Intel and other companies) decided that a PC would need only a 32-bit "register size," which caps the amount of RAM a CPU can address. In mathematical terms, that's 2^32 bytes, or exactly 4GB of RAM. Back then, the high cost of memory and the absence of 64-bit software or operating systems meant that few imagined a CPU running in 64-bit mode.
Fast-forward to 2003. AMD released the first x86 64-bit processor, the Opteron. Suddenly, the rules changed. The CPU could access an astonishing amount of RAM: exactly 2^64 bytes, which translates to 16 exabytes, or roughly 17 billion gigabytes. Not that you would install that much memory—but you could if you wanted to. Since then, operating systems and software have been slowly catching up to the hardware, and today they have (finally!).
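The register-size arithmetic above is easy to check yourself; a quick sketch:

    # 32-bit vs. 64-bit address space, expressed in gigabytes.
    GB = 2**30
    print(2**32 // GB, "GB addressable with 32-bit registers")  # 4
    print(2**64 // GB, "GB with 64-bit registers")              # 17,179,869,184 GB (16 exabytes)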
Quad Core for the Masses
by Joel Santo Domingo
A thousand dollars used to denote the land of the "cheap" PC. In 2005, you'd find systems with single-core processors, 512MB of RAM and weak integrated graphics in that price range. Such a system today would go for $250 tops, less if it were running Linux. So what can you get for around $1,000 these days? A desktop PC that is likely to keep you computing for the next five to seven years before you start thinking of it as too slow. And a system with not just one or two, but four processor cores. Quad-core rigs, which were exotic two years ago when they were introduced, have filtered down into the mainstream, and good ones can be had for less than a grand. Yet quad-core still projects an air of power and competence.
At the $1,000 price point, you'll find systems that have reasonably powerful quad-core processors, 2GB to 3GB of system memory, and a fairly large hard drive (500GB is the norm). Such a system may or may not come with a discrete graphics card; even if it does, you're not going to be playing any high-end DX10 3D games on it. At a minimum, the system will certainly be enough to let you view DVDs and other downloaded or streaming videos, and it will have some internal expansion room for more powerful 3D graphics in the future if you want them. For just under a grand, the Editors' Choice HP Pavilion Elite m9400t is a well-rounded multimedia machine to be reckoned with, sporting a Blu-ray reader and a TV tuner along with more standard features. The similarly priced Gateway DX4710-UB002A counts among its attributes a 640GB hard drive and 6GB of RAM.
If you come down a bit in price, you can still get a system powerful enough to meet your multimedia needs, though you may have to make some compromises. You'll want the quad-core processor and extra RAM if you're part of the "MySpace generation" or if you're heavily into uploading to YouTube, since the extra processing power and memory will help when you convert your phonecam or camcorder footage to an uploadable, sharable format. A system like the Dell Inspiron 518 will do you well if you're hooked on video- and photo-sharing sites, though its 320GB hard drive is relatively small. And although the ZT Affinity 7225Xi lacks some of the features of its more expensive brethren, it gets the job done where it counts: in its multimedia performance scores.
Even business owners can take advantage of the quad-core power of a reasonably priced system such as the Lenovo ThinkCentre A62 for their key employees to get the job done when time is of the essence.
Whether you're a student, a budding multimedia pro, or a business owner, your $1,000 can go a long way in a desktop, toward quad-core and so much more.
Tips for Improving Your Alexa Traffic Rank
Webmasters, of course, mostly know what 'Alexa' is. If you own a web site or blog, you have probably heard of or seen the phrase 'Alexa Rank' too. Alexa measures a site's traffic ranking based on data gathered from users of the Alexa toolbar over a rolling three-month period. A site's ranking is based on two measurements: 'reach' and 'page views.' Reach is measured by the number of global Internet users who visit a particular site, while page views are the total number of Alexa-user requests for a site's URLs. Repeated requests for the same URL by the same user on the same day are counted as a single page view.
Once you understand what Alexa measures, we can conclude that the Alexa Rank is certainly useful for seeing how the site or blog we manage is doing. However, there are also voices that counter this conclusion. They feel that Alexa's ranking calculation is inaccurate and can be cheated by webmasters who are experienced in the field.
However, an Alexa Rank is still needed if we want to monetize our site or blog. In my previous posting about 'get paid to review' programs, there are several commercial services (which pay us to write reviews) that require the Alexa Rank of a registered site to fall within certain limits. They show more 'respect' to sites with a high Alexa traffic rank.
Although my site is still quite new, because I just changed domains, let me share a little of what I know. Based on my experience with the domains I have run, plus guidance from some well-known bloggers, in this post I will present some ways to improve your Alexa traffic rank. The steps are as follows:
1. Install the Alexa toolbar. The toolbar best suited to the Mozilla Firefox browser is known as Sparky, 'the Alexa toolbar for Firefox.' I already use this add-on, and its performance is quite satisfactory: I can see the Alexa Rank of the site I'm visiting in the bottom corner of the Firefox window. Get Sparky here!
2. Place Alexa widgets in the sidebar or elsewhere on your site, so visitors can see your Alexa Rank and the popularity of your links. As this example puts it:
"Place interactive always-updated Alexa widgets on your site. These handy little widgets require virtually no programming skills. Just copy the html and put it on your site. "
3. Use Alexa redirects on links to our blog. Example: http://redirect.alexa.com/redirect?www.hadriyan.com. Replace the address after the question mark with the URL of your own site or blog. I use this redirect address at every opportunity, for example in my email signature or in my signature on forums I follow. (A small script for building these links follows the list.)
4. Write postings related to Alexa Rank optimization, like the one I'm writing now :). Hopefully it works!
5. Create quality articles, substantive and comfortable to read, so visitors feel at home when visiting our site, which of course increases traffic. I once read an interesting article about ways to popularize our site's links that may be useful to you all; please read it here.
6. Do blogwalking often. Visit other people's sites and blogs and leave comments there, so you build relationships and connections while promoting your own site :)
7. Join mailing lists or forums. Do not forget to include a signature containing a link to our site.
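If you want to automate tip number 3, here is a minimal sketch; the redirect format is exactly the one shown above, while the function name and the use of my own domain are just for illustration.

    # Wrap a site URL in the Alexa redirect format from tip #3.
    def alexa_redirect(url):
        return "http://redirect.alexa.com/redirect?" + url.replace("http://", "")

    print(alexa_redirect("http://www.hadriyan.com"))
    # -> http://redirect.alexa.com/redirect?www.hadriyan.com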
Have a nice try and enjoy your traffic! :):)
Tuesday, 23 December 2008
AMD: x86 Supercomputer Supremacy
In mid-December, Advanced Micro Devices (AMD) announced that Quad-Core AMD Opteron processors power "Jaguar," the first x86-based supercomputer able to achieve petaflop performance. AMD's dominance is underlined by the fact that AMD Opteron processors now power 7 of the 10 fastest supercomputers in the world.
The Roadrunner system, based on the IBM PowerXCell processor at Los Alamos National Laboratory, held on to first place in the TOP500 supercomputer list, which is issued twice a year. Meanwhile, the Jaguar supercomputer at Oak Ridge National Laboratory followed in second place and, based on the survey, has become the highest-performing x86-based system.
Jaguar is a system based on the Cray XT4 and XT5 and runs on 45,000 Quad-Core AMD Opteron processors, making it an x86-based system without compare. "Today's TOP500 supercomputing list reconfirms AMD's leadership and the performance that HPC customers can enjoy over the next few years," said Ryan Sim, ASEAN director of sales, AMD.
"7 of the 10 system of computing the most in the world currently utilize a balanced platform from Direct Connect Architecture, AMD. We are committed to the HPC community for memperkuatnya with Quad-Core processor AMD Opteron 45nm launched a new," Ryan supplement.
Recently, AMD also announced the availability of the 45nm Quad-Core AMD Opteron processor, code-named Shanghai. The latest Quad-Core AMD Opteron is better equipped to handle HPC workloads, with higher floating-point performance, more memory bandwidth, faster execution of varied workloads, and better overall performance.
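As a rough sanity check on the petaflop claim, peak throughput is just sockets times cores per socket times clock rate times floating-point operations per core per cycle. The clock rate and FLOPs-per-cycle figures below are assumptions for illustration, not numbers quoted by AMD:

    # Back-of-the-envelope peak for Jaguar's Opteron partition.
    sockets, cores_per_socket = 45_000, 4
    clock_hz, flops_per_cycle = 2.3e9, 4      # assumed, not AMD's figures
    peak = sockets * cores_per_socket * clock_hz * flops_per_cycle
    print(peak / 1e15, "petaflops")           # ~1.66, i.e. petaflop class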
GeCube HD4870
ATI's RV770 series is divided into two accelerators: the HD 4870 and the HD 4850. The HD 4870 is currently the highest caste of ATI's single-chip graphics cards. It is positioned against Nvidia's G92, used by the 9800 GTX, and also against Nvidia's GT200, especially the GTX 260.
Looking at the design, the GeCube HD4870 exactly follows ATI's reference card and cooler. The only GeCube identity on the graphics card is the GeCube label on its body.
As for performance, this HD 4870 is very good, especially since it uses fast GDDR5 memory. It is capable of performing beyond the 9800 GTX, and comes close to the GTX 260. The HD 4870 is a very interesting high-end graphics card, especially since its performance approaches, and even rivals, graphics cards with higher prices. (Steven Irwandi - Contributor)
Zotac GTX 260
The design of Zotac's GTX 260 graphics card is actually exactly the same as the WinFast GTX 260; what distinguishes them is the sticker attached to the cooler. Beyond that, the card, cooler, and fan follow Nvidia's reference design.
The Zotac GTX 260 holds to exactly the same specifications as Nvidia's reference. The standard speed is already very fast. Supported by 896MB of local memory, this graphics card is able to run the latest games.
With the performance offered, it seems reasonable that the asking price is also quite high. Fortunately, performance is not the only thing Zotac offers: there is added value in the Zotac GTX 260 sales package, which bundles the latest game, Race Driver: GRID. (Steven Irwandi - Contributor)
HP Compaq dx2310 MicroTower
Along with its "The Computer is Personal Again" campaign, HP offers a visibly attractive package of PCs and notebooks targeted at different segments. The HP Compaq dx2310 is one of them. This latest HP PC targets the business segment, which is evident from its fusion of reliable and affordable specifications. Relying on an Intel Dual Core processor, the PC is priced at around US$532. That price includes the Windows Vista operating system, a keyboard, and a mouse, but no monitor. If you want a complete PC, packages are available that combine the PC with an LCD monitor.
Because it is intended for office activity, the dx2310 MicroTower is designed in a standard form, without extras such as a card reader. The multimedia devices are also standard: beyond the DVD writer and audio, there is no FireWire connection for purposes such as video editing. All such facilities are optional, so you must spend more to get them. However, to ensure data connectivity, a Gigabit Ethernet network controller is used.
As a PC for office purposes, the dx2310 is deliberately designed with modest specifications. This is evident from the use of a 2GHz Dual Core E2180 processor and 1GB of DDR2 RAM, combined with onboard Intel GMA X3100 graphics. Consequently, when it went through our standard tests, the results were nothing special. Running productivity applications such as Office poses no problem, but a score of 124 on 3DMark06 shows this PC cannot be invited to play heavyweight games.
To support its role, the dx2310 ships with HP Backup and Recovery Manager for easily backing up your important data. Don't worry too much about where to store that data, as the dx2310 provides a hard disk with a capacity of 160GB.
As a desktop PC for business, the HP Compaq dx2310 MT package is sufficient, though we still do not recommend it for gaming. Its specifications appear similar to those of the Nucleom AG80 mini PC, which we also tested, but that machine's Atom processor is clearly slower than the Dual Core. For normal office activity, we can recommend this dx2310 package to you.
Fujitsu Introduces Blade-Sized NAS
Fujitsu Indonesia has launched the PRIMERGY NX650, a small, blade-sized network attached storage (NAS) unit designed as a data center in a box. The product is aimed at the Small and Medium Enterprise (SME) market in Indonesia.
The Fujitsu NAS blade unit comes in two configurations: the NX650 Blade Server and the NX650 Storage Blade. It uses a quad-core PRIMERGY BX620 S4 blade server with 4GB of main memory and two hot-pluggable 36GB hard disk drives (HDDs).
In addition, the sub-server can be upgraded with an additional processor, and main memory can go up to 8GB. On the storage side, Fujitsu offers three hot-pluggable 146GB hard drives, upgradable to five drives, which yields a maximum storage capacity of 730GB. The sub-system as a whole can be upgraded to 1.4TB.
"Creating a data center in a box to make Fujitsu able to meet the needs of local customers for storage in the blade chassis, without the additional difficulty of implementing a storage area network," said Mr. Motohiko Uno, Vice President, Platform Solutions Group, Fujitsu Asia.
Furthermore, Mr. Motohiko disclosed that the NX650 slides into the blade chassis slots, enriching the features of the PRIMERGY blade server family and making it ideal for SMEs.
Dell EMC Expand Strategic Alliance
Dell and EMC have announced an extension of their global strategic alliance through 2013. In addition to deepening the partnership, Dell and EMC will add the EMC Celerra NX4 storage system to the Dell|EMC portfolio of networked storage systems.
"With the extend and expand our alliances, Dell and EMC is providing data center solutions that integrate fully to customers as a superior storage partner for customers worldwide," said Michael Dell, Chairman and CEO of Dell.
The benefits customers have gained from the Dell and EMC cooperation show in the more than 60,000 solutions shipped to date. To give customers additional options, EMC introduced the EMC Celerra NX4, a cost-effective solution that can help customers consolidate NAS, iSCSI, and Fibre Channel technologies on one platform.
Dell and EMC formed their alliance in October 2001, with no other objective than to help customers minimize complexity and more effectively store, manage, and protect their information through networked storage.
Dell and EMC's collaboration in 2008 included the new Dell|EMC CX4 networked storage system, introduced in August. Its design incorporates the latest technology in drives, connectivity, processing power, convenience, and security.
The latest extension of the alliance came last month, when the two companies began collaborating on data de-duplication, one of the fastest-growing segments of the storage market. It helps customers reduce costs and improve backup efficiency with target-based de-duplication technology.
source : infokomputer.com
Toshiba Unveils 512GB SSD
Toshiba back "swing" throne disk vendors. Arsenal teranyar issued a solid state drive (SSD) 512GB, only a few days before the lapse in the big Consumer Electronics Show (CES) 2009 in Las Vegas, USA. But Toshiba will mass produce 2.5-inch SSD this quarter to two in 2009.
Despite not mentioning the price, Toshiba hopes the 512GB SSD will be adopted by high-end notebooks and home entertainment systems, including gaming machines. However, PS3 owners probably won't intend to replace their hard disks with an SSD until the price is really affordable.
Besides doubling the capacity of any 2.5-inch SSD produced before it, this second-generation Toshiba SSD packs new 43nm Multi-Level Cell (MLC) NAND technology. Not only does this offer roomy storage space in a 2.5-inch package, the new technology also speeds up reads and writes: sequential read speed reaches a maximum of 240MBps and sequential write speed 200MBps.
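Those sequential figures are easy to put in perspective with a little arithmetic (decimal gigabytes and a sustained rate assumed):

    # Time to write the full 512GB at the quoted 200MBps sequential rate.
    capacity_mb = 512 * 1000                 # decimal GB assumed
    write_mb_per_s = 200                     # quoted sequential write speed
    print(capacity_mb / write_mb_per_s / 60, "minutes to fill the drive")  # ~42.7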
Toshiba also includes AES data encryption to prevent illegal access to data. Other attractions of the SSD are a Mean Time To Failure (MTTF) of 1 million hours and a Serial ATA 3.0Gb/s interface.
In addition to the 512GB capacity, the SSD line is also offered in 64GB, 128GB, and 256GB capacities, measuring 2.5 or 1.8 inches.
A product range that continues to vary in capacity and interface is part of Toshiba's strategy to encourage growth in SSD demand, explained Kiyoshi Kobayashi (VP, Toshiba Corp. Semiconductor Company). Toshiba expects SSDs to shake hard-drive makers' domination, seizing about 10% of the notebook market in 2010 and 25% in 2012.
Intel Core i7
Intel Core i7 is a family of three Intel desktop x86-64 processors, the first processors released using the Intel Nehalem microarchitecture and the successor to the Intel Core 2 family. All three models are quad-core processors. The Core i7 identifier applies to the initial family of processors[5][6] codenamed Bloomfield. Intel representatives state that the moniker Core i7 does not have any deeper meaning. The name continues the use of the successful Core brand. Core i7, first assembled in Costa Rica, was officially launched on November 17, 2008 and is manufactured in Arizona, New Mexico and Oregon though the Oregon plant is moving to the next generation 32nm process.
Features
The Nehalem architecture has many new features, some of which are present in the Core i7. The ones that represent significant changes from the Core 2 include:
* The new LGA 1366 socket is incompatible with earlier processors.
* On-die memory controller: the memory is directly connected to the processor.
o Three channel memory: each channel can support one or two DDR3 DIMMs. Motherboards for Core i7 have four (3+1) or six DIMM slots instead of two or four, and DIMMs should be installed in sets of three, not two.
o Support for DDR3 only.
o No ECC support.
* The front side bus is replaced by QuickPath interface. Motherboards must use a chipset that supports QuickPath.
* Single-die device: all four cores, the memory controller, and all cache are on a single die.
* "Turbo Boost" technology allows all active cores to intelligently clock themselves up in steps of 133 MHz over the design clock rate as long as the CPU's predetermined thermal and electrical requirements are still met.[11]
* Re-implemented Hyper-threading. Each of the four cores can process up to two threads simultaneously, so the processor appears to the OS as eight CPUs. This feature was present in the older NetBurst architecture but was dropped in Core.
* On-die, shared, inclusive 8MB L3 cache.
* Only one QuickPath interface: not intended for multi-processor motherboards.
* 45nm process technology.
* 731M transistors.
* Sophisticated power management can place an unused core in a zero-power mode.
* Support for SSE4.2 & SSE4.1 instruction sets.
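Two of the bullet points above, Turbo Boost stepping and triple-channel memory, reduce to simple arithmetic. In the sketch below, the 2.66 GHz base clock of the entry-level i7-920 and DDR3-1066 memory are assumptions for illustration, not figures from the list:

    # Turbo Boost: the clock rises in 133 MHz steps over the base rate.
    base_mhz, step_mhz = 2_666, 133            # i7-920 base clock (assumed)
    for steps in (1, 2):
        print(steps, "step(s):", base_mhz + steps * step_mhz, "MHz")

    # Triple-channel peak bandwidth: channels x transfers/s x 8 bytes.
    channels, mt_per_s, width_bytes = 3, 1066, 8   # DDR3-1066 (assumed)
    print(channels * mt_per_s * width_bytes / 1000, "GB/s theoretical peak")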
Processor cores
* The clock rates listed here are as specified by Intel for normal mode. "Turbo boost" can increase the rate on active cores in steps of 133 MHz up to a predetermined limit for short periods when required.
* The 965 XE has separate unlocked multipliers for memory and cores.
o Core clock rates above those in the table are not guaranteed by Intel.[2] Rates above 5GHz have been reported.
o Memory rates above those in the table are not guaranteed by Intel.[2] Rates above DDR3-2000 have been reported.
* The processor has a Thermal Design Power of 130W and will slow itself down if this power is exceeded. This feature can be disabled from an option in most of the new motherboards' BIOS.[12]
* Prices are per unit, in lots of 1,000, in USD.
LG BD300
The LG BD300 is a feature-packed Blu-ray player with BD-Live, a USB port for viewing multimedia, and Netflix on-demand streaming. For $350, though, we expected better-looking images. In the PC World Test Center's evaluation, our judges scored the images as Good or Very Good, but we noticed some issues. For example, one of us thought certain scenes in our test movie showed too much contrast, while another judge noted artifacts in the sky.
If the price were a little lower, or the image quality more consistent, the LG BD300 would be a knockout. As it stands, it's a solid player, and for now it remains the most affordable choice if you're determined to have your Blu-ray and your Netflix, too.
Strategies to Create a Vision
According to Peter F. Drucker, the foundation of effective leadership is thinking through the organization's vision and mission, defining them, and establishing them clearly and visibly. Leaders set goals, determine priorities, and set and monitor standards.
Meanwhile, according to Tony Buzan in The Power of Spiritual Intelligence, vision is defined as the ability to think about or plan the future wisely and imaginatively, using a mental picture of situations that could or may occur in the future. A company's vision is the company's carefully formulated ideal desire, which determines its direction or future circumstances. Researchers see vision as very important for leadership, strategy implementation, and change (Doz & Prahalad, 1987; Hunt, 1991; Kotter, 1990; Robbins & Duncan, 1988; Sashkin, 1988).
Thus, vision is the starting point of a company's future. Vision is a very powerful idea that can propel a company into the future by rallying all of its resources to realize that vision. The right vision attracts and leads other people to make commitments; it generates energy and enthusiasm; it creates meaning in the life of the company; it establishes a standard that can be used to measure the company's success; it lets outsiders (customers) measure the company's usefulness to them; it is the main bridge between what the company does now and what it wants to become; and it is a key prerequisite of the moment and the strategic basis for formulating the company's mission. Intel's vision is to keep pushing the boundaries of innovation so that people's lives can be more exciting, more fulfilling, and easier to manage. Intel's strong commitment to driving technology forward has transformed the world by leaps and bounds. Intel is a company that always keeps moving, in an industry that never rests. Intel challenges its partners to develop innovative products and services, and rallies the industry to support solutions that collectively deliver greater benefits, more quickly.
At Microsoft's founding, Bill Gates had the vision of "a computer on every desk in every home, running Microsoft software." After creating MS-DOS, he built the Windows operating system, whose place in homes and offices made him the richest entrepreneur in the world. It is not quite right to say that Bill Gates merely played a role in putting the PC in the world's offices and homes; rather, he had the vision to see what was possible and the desire to turn that vision into reality.
Thomas Watson Sr. changed the name of the Computing Tabulating Recording Company to International Business Machines (IBM) even though it did not yet operate internationally, because he had a vision that the company would later do so. When Watson announced the name International Business Machines, many people laughed at him; some said the name was far too grand for the company. But IBM is now a modern company, and its managers became role models, with their white shirts, plain ties, and all-out sales spirit. In 2003 the company was selected as the world's most spectacular computer company, according to Fortune magazine.
Jeff Bezos is the founder of Amazon.com, which initially was only a bookstore. Why did he name the company Amazon? Because "the Amazon river is the largest river in the world," said Jeff Bezos. What about the Nile? The Nile is the longest river in the world, but compared by volume of water, it is a mere tributary next to the Amazon. The Amazon carries 20% of the world's river water, and Jeff Bezos wanted his company to likewise control 20% of the world market. Amazon.com opened its e-commerce site in July 1995, starting with books, then spreading to compact discs (CDs), then auctions, and now thousands of different products from different merchants. Sales of 15.7 million dollars in 1996 jumped to 600 million dollars in 1998. Amazon.com reported that from November 1 through December 23, 2002, consumers worldwide ordered 56 million items, making it the best online store of 2002 according to Yahoo magazine.
John F. Welch, Jr., former CEO of General Electric, stated: "We use three principles to set the atmosphere for operations and behavior at General Electric: boundarylessness in all our behavior, speed in everything we do, and stretch in every target we set." Boundaryless behavior rallies GE's twelve major global businesses, each number one or number two in its market, into a laboratory whose main product is new ideas, along with a shared commitment to spread those ideas throughout the company. Speed is something not usually found in a company of General Electric's size, but GE found it in the pace of its product development, in design recycling (from order to delivery), and in its restored ability to reduce investment in plant and equipment. Stretch means using dreams to set targets that exceed the targets already set.
SOURCE : msuyanto.com
Windows 7 Beta on Hold Until 2009
By : Randall C. Kennedy, InfoWorld
Sometimes I don't know my own strength. After several painful weeks of poking holes in the Windows 7 bubble (and being poked right back by the legions of Windows zealots), it seems my message about Microsoft not doing enough to satisfy IT is finally getting through: The company has now officially delayed the release of the first public Windows 7 beta until "early 2009" -- per the company's PR firm, Waggener Edstrom.
A delay of this magnitude, hot on the heels of our scathing rebuke of the PDC pre-beta, can mean only one thing: It's running scared. Microsoft is so concerned by the overwhelming response to our groundbreaking expose, "Windows 7 unmasked," that it's pulling back on the delivery reins so that it can retool the product to address the myriad performance and compatibility issues we identified.
I, for one, applaud their honesty. Microsoft knows it's dropped the ball with Windows 7; the initial PDC build was woefully inadequate and demonstrated none of the claimed improvements in performance or resource consumption. Delaying the public beta program -- which was generally accepted to be slated for the mid-December 2008 timeframe -- is a smart move. It'll give the company a chance to take another pass at the kernel code base and maybe, just maybe, reconsider dropping some of that consumer-focused baggage.
So, it's no Windows 7 Beta for Christmas this year, kids. You'll just have to keep playing with Vista or, if you're lucky, the "Blue Badge" unlocked version of the PDC build (6801). In the meantime, check out Paul Thurrott's concerns about the Windows 7 GUI and how it's "easy" but not "simple" -- or was it the other way around?
I especially like the part where he says that Windows 7 is "in the can. It's done. There are no major changes coming." As an FOM (Friend of Microsoft) in good standing, Paul should know. He gets special access to all sorts of supersecret Microsoft stuff -- a reward for his normally glowing coverage of all things Redmond. So when even he voices his concern over an issue (the befuddling Windows 7 GUI) and follows it up by stating that the product is basically finished at this point, you know we're in trouble.
Note: So far, I've focused mostly on the kernel mode aspects of Windows 7. But since it turns out there's really nothing to see down there (it's basically Vista + some minor tweaks), I think it may be time I turned my attention to the stuff that actually has changed -- i.e., the shell glitz and other user-land components.
Anybody got a spare copy of a post-PDC build I can borrow? We asked Microsoft for a copy, but the company said it "can't accommodate us" at this time (code for: you're blacklisted!).
Maybe they can just pre-install it on the FREE LAPTOP COMPUTER I requested. I'm still waiting to hear back from them on that one, but so far it doesn't look good. Oh well…maybe next year!
LG LCD, Suited for the Outdoors
Using a notebook outdoors is of course different from using it indoors. Outdoors, far more light hits the notebook, so the screen has to fight glare, and reflections of the sun keep us from seeing the image perfectly.
LG provides a solution. Not long ago, LG announced its newest creation, an LCD screen type adequate for use outdoors. LG's latest display uses reflective technology, replacing the usual transmissive approach. Moreover, the reflective mode also takes advantage of sunlight, with the aim of reducing drain on the notebook's energy source. LG says the notebook's energy use can be cut by 75%, with a contrast ratio of 9:1.
Unfortunately, this LG LCD is available in only one size, 14.1 inches, and is not yet available for sub-notebooks with mini screens. In future development, a size suitable for netbooks is also expected.
(Indah PM / engadget)
source : http://www.infokomputer.com/index.php/news/read/1959/LCD%20LG%20%20Kompatibel%20untuk%20Luar%20Ruangan