In this study, we propose two novel semantic language modeling techniques for spoken dialog systems: semantic concept-based language modeling and semantic structured language modeling. In concept-based language modeling, we use long-span semantic units to model meaning sequences in spoken utterances. In the latter technique, we use statistical semantic parsers to extract information from a sentence, which is then utilized in a maximum entropy-based language model. The language models are trained and evaluated in the air travel reservation domain. We obtain improvements over a sophisticated class-based N-gram language model in terms of both recognition accuracy and perplexity. Interpolating the proposed techniques with the class-based N-gram language model provides additional improvement.
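As an illustration of the final step mentioned above, the sketch below shows plain linear interpolation of a semantic language model with a class-based N-gram language model. It is a minimal illustration only, not the paper's implementation; the probability functions, the interpolation weight `lam`, and the toy values are hypothetical placeholders.

```python
# Minimal sketch (assumption, not the paper's code): linear interpolation of
# two language models, P(w|h) = lam * P_semantic(w|h) + (1 - lam) * P_ngram(w|h).

def interpolated_prob(word, history, p_semantic, p_ngram, lam=0.5):
    """Return the linearly interpolated probability of `word` given `history`."""
    return lam * p_semantic(word, history) + (1.0 - lam) * p_ngram(word, history)

if __name__ == "__main__":
    # Stand-in component models with fixed toy probabilities.
    p_semantic = lambda w, h: 0.02   # placeholder for the semantic LM
    p_ngram = lambda w, h: 0.01      # placeholder for the class-based N-gram LM
    print(interpolated_prob("boston", ("to",), p_semantic, p_ngram, lam=0.4))
```

In practice the weight `lam` would be tuned on held-out data, for example to minimize perplexity.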