# Optimization Theory vs. Practice

6,792 reads · 2018-6-6 03:24 | Personal category: S and T | System category: Overseas Observations

For new readers and those who request to be "good friends" (好友), please read my announcement board first.

I just attended a talk subtitled "A Case Study of Deep Optimization," which in my opinion could also have been titled "The Difference Between Optimization Theory and Practice."

Here is the real-world problem that was solved, as I understand it (you can read the author's abstract of the talk below). In the early days of radio and TV, the government gave out various radio-TV frequency bands to stations licensed to use them. Since there were few stations and plenty of frequency bands available, many stations got more bandwidth than they could use. But nowadays, with nearly 3,000 stations nationwide and demand from the military, astronomy, and other uses, demand has begun to outstrip supply. The Federal Communications Commission (FCC) wants to buy back unused frequency bands and redistribute channels among stations to meet demand more efficiently and fairly and to benefit society. Hence the title of the talk, "Incentive Auctions and Spectrum Repacking."

Here is the abstract of the author's talk:

Over 13 months in 2016–17 the US Federal Communications Commission conducted an "incentive auction" to repurpose radio spectrum from broadcast television to wireless internet. In the end, the auction yielded $19.8 billion USD, $10.05 billion USD of which was paid to 175 broadcasters for voluntarily relinquishing their licenses across 14 UHF channels. Stations that continued broadcasting were assigned potentially new channels to fit as densely as possible into the channels that remained. The government netted more than $7 billion USD (used to pay down the national debt) after covering costs (including retuning). A crucial element of the auction design was the construction of a solver, dubbed SATFC, that determined whether sets of stations could be "repacked" in this way; it needed to run every time a station was given a price quote.

This talk describes the process by which we built SATFC and its impact on the auction. We adopted an approach we dub "deep optimization", taking a data-driven, highly parametric, and computationally intensive approach to solver design. More specifically, to build SATFC we designed software that could pair both complete and local-search SAT-encoded feasibility checking with a wide range of domain-specific techniques, such as constraint graph decomposition and novel caching mechanisms that allow for reuse of partial solutions from related, solved problems. We then used automatic algorithm configuration techniques to construct a portfolio of eight complementary algorithms to be run in parallel, aiming to achieve good performance on instances that arose in proprietary auction simulations. We found that within the short time budget required in practice, SATFC solved more than 95% of the problems it encountered. Furthermore, simulation results showed that the incentive auction paired with SATFC produced nearly optimal allocations in a restricted setting and achieved substantially better economic outcomes than other alternatives at national scale.

Speaker: Kevin Leyton-Brown is a professor of Computer Science at the University of British Columbia and an associate member of the Vancouver School of Economics.

The real complication of the problem comes from the constraints. Physically nearby stations must be assigned frequency bands sufficiently different to minimize interference. Established stations are reluctant to change broadcast channels because of the cost, so the government has to reimburse existing stations for changes, buy back frequency bands, and sell them to meet new demand. The number of combinatorial possibilities quickly exceeds the number of atoms in the universe; thus, what theoretically appears to be a large mixed-integer programming problem with constraints is practically impossible to solve. What the authors proposed is to conduct two auctions among stations for frequency band use: one for buying back bands at gradually increasing offer prices, the other for selling bands at gradually decreasing prices. The auctions proceed in rounds, at one-minute intervals before each price change. The objective is to achieve efficiency (buying cost and selling income) and fairness while satisfying all the constraints.
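To make the repacking constraint concrete, here is a minimal sketch (my own illustration, not SATFC's actual encoding) of the core feasibility question: can every station be assigned a channel so that no two interfering stations share one? This is essentially graph coloring, which is exactly what makes the problem combinatorially explosive; the real FCC constraints are far richer (adjacent-channel rules, border agreements, etc.).

```python
def repack_feasible(stations, channels, interference, assignment=None):
    """Backtracking search for a channel assignment in which no two
    interfering stations share a channel. `interference` is a set of
    frozenset station pairs -- a hypothetical, simplified model.
    Returns a station->channel dict, or None if infeasible."""
    assignment = assignment or {}
    if len(assignment) == len(stations):
        return assignment
    station = next(s for s in stations if s not in assignment)
    for ch in channels:
        # Skip channels already used by an interfering neighbor.
        if all(assignment.get(other) != ch
               for other in stations
               if frozenset((station, other)) in interference):
            result = repack_feasible(stations, channels, interference,
                                     {**assignment, station: ch})
            if result is not None:
                return result
    return None  # every channel conflicted somewhere: backtrack

# Tiny example: three mutually interfering stations need three channels.
stations = ["A", "B", "C"]
interference = {frozenset(p) for p in [("A", "B"), ("B", "C"), ("A", "C")]}
print(repack_feasible(stations, [1, 2], interference))                 # None: infeasible
print(repack_feasible(stations, [1, 2, 3], interference) is not None)  # True
```

SATFC instead encodes this question into SAT and hands it to tuned SAT solvers, which scale far beyond naive backtracking; the sketch only shows the shape of the question being answered at every price quote.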

I must confess I did not understand all the details of the algorithms the authors came up with. As near as I can figure, the algorithm is basically a type of brute-force search (taking advantage of computer speed and massive parallelism), coupled with judicious, common-sense elimination of the possibilities to consider, and a combination of several optimization techniques run in parallel, to accomplish this monstrous real-world task. The success of the project is revealed in the post-mortem analysis and the monetary profit realized by the government.
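The "several techniques in parallel" idea can be sketched as an algorithm portfolio: launch complementary solvers concurrently and take the first answer any of them produces. The toy "solvers" and problem below are my own stand-ins, not SATFC's actual eight algorithms.

```python
from concurrent.futures import ThreadPoolExecutor, FIRST_COMPLETED, wait

def run_portfolio(solvers, problem, timeout):
    """Run complementary solvers in parallel; return the first non-None
    answer, or None if the time budget expires (a simplified stand-in
    for SATFC's eight-algorithm portfolio)."""
    with ThreadPoolExecutor(max_workers=len(solvers)) as pool:
        pending = {pool.submit(s, problem) for s in solvers}
        while pending:
            done, pending = wait(pending, timeout=timeout,
                                 return_when=FIRST_COMPLETED)
            if not done:          # time budget exhausted: report "unknown"
                return None
            for f in done:
                result = f.result()
                if result is not None:
                    return result  # first solver to succeed wins
    return None

# Two toy "solvers" that hunt for a divisor of n from opposite ends.
up   = lambda n: next((d for d in range(2, n) if n % d == 0), None)
down = lambda n: next((d for d in range(n - 1, 1, -1) if n % d == 0), None)
print(run_portfolio([up, down], 91, timeout=1.0))  # 7 or 13, whichever lands first
```

A real portfolio would also cancel the losing solvers once a winner returns; the point of the design is that solvers with complementary strengths cover each other's hard instances within a short time budget.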

My takeaways from this talk are:

1. Real-world optimization problems seldom fit neatly into optimization theory.

2. But this does not mean that most optimization theories and techniques are unrealistic or useless. Knowing the theory and techniques enables project engineers to make simplifications, combine tools, and use insightful approximations to solve problems with a reasonable expectation of yielding good-enough solutions.

3. This is combining theory with practice.

Academics and researchers, take note: practice is NOT a trivial "exercise left to the reader."

https://wap.sciencenet.cn/blog-1565-1117595.html

