By "framework" I mean the overall architecture, from the very bottom hardware up to the high-level application programming language. To make this clearer, let me describe the conventional computing framework in my own way (of course, this is not my unique idea and is well known).
As shown in the following figure, at the very bottom there is the most fundamental electronic component, called the transistor, which belongs mostly to the area of physics. At the next layer, electrical engineers and physicists build various gates, and based on those gates various processors and memories are built. Theoretically, once we have memory and a processor, we are ready to build any program following the logic of a Turing machine. On top of the hardware, fundamental software is implemented. This level of software is mainly machine language or assembly language. On top of these fundamental languages, high-level application languages (like C, C++, Java, etc.) are built.
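As a toy illustration of this layering (my own sketch, not from any particular textbook), the snippet below starts from a single universal gate (NAND, which transistors commonly implement) and composes everything else from it, one layer up to a half adder, the seed of an arithmetic unit:

```python
# Bottom layer: NAND, a universal gate buildable directly from transistors.
def nand(a: int, b: int) -> int:
    return 0 if (a and b) else 1

# Next layer: the basic Boolean operations, composed purely from NAND.
def not_(a):    return nand(a, a)
def and_(a, b): return not_(nand(a, b))
def or_(a, b):  return nand(not_(a), not_(b))
def xor_(a, b): return and_(or_(a, b), nand(a, b))

# One more layer up: a half adder, the building block of arithmetic circuits.
def half_adder(a, b):
    return xor_(a, b), and_(a, b)   # (sum bit, carry bit)

print(half_adder(1, 1))  # -> (0, 1), i.e. 1 + 1 = binary 10
```

Real hardware of course does this in silicon rather than Python, but the point is the same: each layer only needs the layer directly below it.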
Universal Gate Model : This model (framework) sounds very similar to the conventional computing framework to me. At the very bottom, there is a layer of fundamental hardware implementing qubits with the fundamental properties of quantum mechanics (superposition, entanglement). On top of it, various types of quantum gates are defined and implemented. And on top of those, several fundamental software languages have been developed to build more complicated quantum circuits. But as of now (Feb 2020), the analogy with the conventional computing model stops here. As far as I know, we don't have a proper model/tool to work at a level similar to a high-level language. I think there are two major reasons for this. First, the technology hasn't reached the point where we have a quantum computer with the number of qubits required for such a model. Second, we don't yet have a diverse enough set of computing methods/models to do high-level tasks using a quantum computer. I think IBM's quantum computing model is the most well-known example of this framework.
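To make the qubit layer and gate layer concrete, here is a tiny state-vector simulation (my own sketch, not IBM's actual software stack): a Hadamard gate puts one qubit into superposition, and a CNOT then entangles it with a second qubit, producing a Bell state.

```python
import numpy as np

H = (1 / np.sqrt(2)) * np.array([[1, 1],
                                 [1, -1]])   # Hadamard gate: creates superposition
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])              # controlled-NOT gate: creates entanglement
I = np.eye(2)

state = np.array([1.0, 0.0, 0.0, 0.0])      # two qubits starting in |00>
state = np.kron(H, I) @ state               # Hadamard on qubit 0
state = CNOT @ state                        # entangle qubit 0 with qubit 1

# Measurement probabilities for |00>, |01>, |10>, |11>:
print(np.round(state**2, 3))                # -> [0.5 0.  0.  0.5]
```

Measuring this state gives |00> or |11> with equal probability and never |01> or |10>, which is exactly the correlation that entanglement provides and that gate-model algorithms exploit.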
Quantum Annealing Model : There is another type of quantum computing framework, which is being actively promoted by D-Wave. In this model, I don't see any clear distinction between the fundamental qubit layer and a gate layer. I think this model is also based on real qubits, but they don't seem to build the system in such a hierarchical way as in the Universal Gate Model. In this model, the whole processor is implemented to solve a specific type of function (called a Hamiltonian or energy function) using a method called quantum annealing (I have linked several videos about quantum annealing and the D-Wave system on the Video Link page). In this model, programming means converting the problem that you want to solve into the form of an energy function that the quantum processor is designed to minimize.
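To illustrate what "converting a problem into an energy function" looks like (my own toy example, not D-Wave's API), the snippet below encodes a number-partitioning problem as an Ising-style energy over spin variables. The annealer's job would be to find the lowest-energy spin assignment in hardware; here we simply brute-force it classically:

```python
import itertools

# Problem: split these numbers into two groups with equal sums.
numbers = [4, 7, 1, 5, 3]

def energy(spins):
    """Ising-style energy function: (sum of s_i * n_i)^2 is zero exactly
    when the s_i = +1 group and the s_i = -1 group have equal sums."""
    return sum(s * n for s, n in zip(spins, numbers)) ** 2

# A real annealer would minimize this energy in hardware; for 5 spins we
# can just enumerate all 2^5 assignments.
best = min(itertools.product([1, -1], repeat=len(numbers)), key=energy)
group_a = [n for s, n in zip(best, numbers) if s == 1]
group_b = [n for s, n in zip(best, numbers) if s == -1]
print(group_a, group_b)   # two groups, each summing to 10
```

The key point is that the "program" is nothing but the coefficients of the energy function; the hardware's fixed behavior (finding low-energy states) does the rest.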
The Universal Gate Model would support more generic algorithms, meaning that you may be able to solve more general problems, and this may be closer to the type of quantum computer that we eventually want to get. But usually a quantum computer based on this model is implemented at a very microscopic scale, like the atomic level. The technology seems to be pretty mature in terms of controlling each atomic-level qubit and maintaining the state of those qubits for a reasonably long period of time. But as the number of qubits increases, it gets harder and harder to build a control system that controls all of those qubits as desired while minimizing the interference between the qubits themselves, and between the qubits and the external control circuits. As far as I could google while writing this (Feb 2020), the largest qubit count from IBM seems to be 53.
The Quantum Annealing Model would solve only a specific form of problem. The D-Wave system, which is based on this model, implements its qubits at a macro scale, which makes it easier to build the control system compared to atomic-level qubits. Partially due to this fact, D-Wave manages to implement a system with an incomparably larger number of qubits. As of now (Feb 2020), the largest qubit count for D-Wave seems to be 2048 (D-Wave 2000Q), and this number is expected to exceed 5000 (Advantage) in mid-2020.